{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/data-model"},"x-facet":{"type":"skill","slug":"data-model","display":"Data Model","count":100},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3d59b8b7-a4f"},"title":"Salesforce Business Analyst","description":"<p><strong>Job Description</strong></p>\n<p>Gather, analyse, and document business requirements for Salesforce Automotive Cloud solutions supporting dealer, sales, and customer lifecycle processes.</p>\n<p>Translate business needs into clear functional specifications, user stories, and acceptance criteria.</p>\n<p>Collaborate with developers, architects, and product owners to design scalable Automotive Cloud solutions.</p>\n<p>Analyse and optimise business processes related to vehicle sales, dealer management, and customer engagement.</p>\n<p>Support testing activities, including functional testing and User Acceptance Testing (UAT) coordination.</p>\n<p>Contribute to Salesforce configuration activities, including Flows and process automation.</p>\n<p>Support data analysis, reporting requirements, and customer and vehicle data management within Salesforce.</p>\n<p>Maintain clear documentation of requirements, processes, and implemented solutions.</p>\n<p><strong>Qualifications</strong></p>\n<p><strong>Must-haves</strong></p>\n<p>Proven experience as a Salesforce Business Analyst in CRM or digital transformation projects.</p>\n<p>Strong experience in requirements engineering, including requirements gathering, documentation, and analysis.</p>\n<p>Hands-on experience with Salesforce configuration, including objects, fields, validation rules, and basic administration.</p>\n<p>Experience building and maintaining Salesforce Flows for business process automation.</p>\n<p>Knowledge of Salesforce Automotive Cloud or strong experience with Sales Cloud and/or Service Cloud.</p>\n<p>Experience translating business requirements into user stories and functional specifications.</p>\n<p>Experience working in Agile environments (preferably SAFe).</p>\n<p>Strong stakeholder management and workshop facilitation skills.</p>\n<p>Experience supporting testing activities, including functional testing and UAT processes.</p>\n<p>Good understanding of CRM data models and customer lifecycle management processes.</p>\n<p><strong>Nice-to-haves</strong></p>\n<p>Salesforce certifications (e.g., Salesforce Business Analyst, Salesforce Administrator, Sales Cloud or Service Cloud Consultant).</p>\n<p>Experience with Automotive Cloud data models, including vehicle, driver, dealer, and service lifecycle objects.</p>\n<p>Familiarity with Salesforce integrations and APIs.</p>\n<p>Experience in the automotive industry or within OEM/dealer network environments.</p>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Full-time position with 40 hours of work per week.</li>\n<li>27 vacation days.</li>\n<li>Unlimited employment contract.</li>\n<li>Flexible working hours.</li>\n<li>Opportunity 
to work in a dynamic and innovative environment.</li>\n</ul>\n<p><strong>How to Apply</strong></p>\n<p>If you are interested in this opportunity, please submit your application through our website.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3d59b8b7-a4f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"MHP","sameAs":"http://www.mhp.com/","logo":"https://logos.yubhub.co/mhp.com.png"},"x-apply-url":"https://jobs.porsche.com/index.php?ac=jobad&id=20057","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Salesforce Business Analyst","Requirements Engineering","Salesforce Configuration","Salesforce Flows","Agile Environments","Stakeholder Management","Workshop Facilitation","CRM Data Models","Customer Lifecycle Management"],"x-skills-preferred":["Salesforce Certifications","Automotive Cloud Data Models","Salesforce Integrations","APIs","Automotive Industry","OEM/Dealer Network Environments"],"datePosted":"2026-04-22T17:26:47.415Z","employmentType":"FULL_TIME","occupationalCategory":"IT","industry":"Consulting","skills":"Salesforce Business Analyst, Requirements Engineering, Salesforce Configuration, Salesforce Flows, Agile Environments, Stakeholder Management, Workshop Facilitation, CRM Data Models, Customer Lifecycle Management, Salesforce Certifications, Automotive Cloud Data Models, Salesforce Integrations, APIs, Automotive Industry, OEM/Dealer Network Environments"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a7f761de-2d8"},"title":"Software Developer – Microsoft Power Platform & Dynamics 365","description":"<p>As a Software Developer – Microsoft Power Platform &amp; Dynamics 365 at Porsche Engineering Romania, you will play a key role in the company&#39;s digitalization transformation. Working in a SAFe environment, you will deliver business applications and process automation using the Microsoft Power Platform and Dynamics 365, enabling scalable, data-driven, and AI-ready solutions.</p>\n<p>Your responsibilities will include designing, developing, and maintaining business solutions using the Microsoft Power Platform (Power Apps, Power Automate, Power BI, Dataverse). You will build and enhance both model-driven and canvas Power Apps that align with business requirements. Additionally, you will develop and configure Dynamics 365 solutions, working with out-of-the-box capabilities as well as custom extensions.</p>\n<p>You will also implement business process automation with Power Automate, including integrations with D365 and external systems. Furthermore, you will design and maintain Power BI dashboards and reports that enable data-driven decision making. You will support data and document migration activities for Dynamics 365 and Dataverse.</p>\n<p>You will participate in requirements workshops and translate business needs into technical solutions. You will collaborate with cross-functional and international teams in a SAFe/Agile environment. You will ensure solution quality, security, performance, and maintainability by following best practices.</p>\n<p>You will provide technical input, documentation, and support throughout the solution lifecycle. 
You will act as a trusted technical contributor for customers and internal stakeholders.</p>\n<p>To succeed in this role, you will need to have successfully completed a Bachelor&#39;s or Master&#39;s degree in Information Technology or an equivalent education. You will also need to have at least 2–5 years of experience developing solutions with Microsoft Power Platform and/or Dynamics 365.</p>\n<p>You will need to have hands-on experience with Power Apps (model-driven and/or canvas), Power Automate, Dataverse, Power BI, and Power Virtual Agents. You will also need to be experienced in configuring and customizing Dynamics 365, whether on-premises or in the cloud.</p>\n<p>You will need to have a solid understanding of ETL concepts, data modeling, and system integrations. You will also need to have worked with Dynamics 365 integrations and data exchange between systems.</p>\n<p>You will need to be familiar with licensing concepts for both the Power Platform and Dynamics 365. You will also need to be able to explain technical solutions in clear, business-friendly language.</p>\n<p>You will demonstrate strong analytical, problem-solving, and organizational skills. You will speak English fluently, and German is considered an advantage.</p>\n<p>You will communicate effectively, collaborate well in teams, and have a strong commitment to delivering high-quality engineering services to customers.</p>\n<p>You are willing to travel when required.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_a7f761de-2d8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Porsche Engineering Services GmbH","sameAs":"https://jobs.porsche.com","logo":"https://logos.yubhub.co/jobs.porsche.com.png"},"x-apply-url":"https://jobs.porsche.com/index.php?ac=jobad&id=20059","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Microsoft Power Platform","Dynamics 365","Power Apps","Power Automate","Power BI","Dataverse","ETL concepts","data modeling","system integrations"],"x-skills-preferred":[],"datePosted":"2026-04-22T17:25:42.590Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Cluj"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Automotive","skills":"Microsoft Power Platform, Dynamics 365, Power Apps, Power Automate, Power BI, Dataverse, ETL concepts, data modeling, system integrations"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ad0816b8-15c"},"title":"Software Developer – Microsoft Power Platform & Dynamics 365","description":"<p>The Software Developer – Microsoft Power Platform &amp; Dynamics 365 role at Porsche Engineering Romania plays a key role in the company&#39;s digitalization transformation. Working in a SAFe environment, the role delivers business applications and process automation using the Microsoft Power Platform and Dynamics 365, enabling scalable, data-driven, and AI-ready solutions. 
By translating business requirements into high-quality digital solutions, the role supports efficient processes, system integration, and innovation across the organization.</p>\n<p>Key responsibilities include designing, developing, and maintaining business solutions using the Microsoft Power Platform (Power Apps, Power Automate, Power BI, Dataverse), building and enhancing both model-driven and canvas Power Apps, developing and configuring Dynamics 365 solutions, implementing business process automation with Power Automate, and supporting data and document migration activities for Dynamics 365 and Dataverse.</p>\n<p>The ideal candidate will have successfully completed a Bachelor&#39;s or Master&#39;s degree in Information Technology or an equivalent education, with at least 2–5 years of experience developing solutions with Microsoft Power Platform and/or Dynamics 365. They will also have hands-on experience with Power Apps (model-driven and/or canvas), Power Automate, Dataverse, Power BI, and Power Virtual Agents, as well as a solid understanding of ETL concepts, data modeling, and system integrations.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ad0816b8-15c","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Porsche Engineering Services GmbH","sameAs":"https://jobs.porsche.com","logo":"https://logos.yubhub.co/jobs.porsche.com.png"},"x-apply-url":"https://jobs.porsche.com/index.php?ac=jobad&id=20058","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Microsoft Power Platform","Dynamics 365","Power Apps","Power Automate","Power BI","Dataverse","ETL concepts","data modeling","system integrations"],"x-skills-preferred":[],"datePosted":"2026-04-22T17:24:46.650Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Cluj"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Microsoft Power Platform, Dynamics 365, Power Apps, Power Automate, Power BI, Dataverse, ETL concepts, data modeling, system integrations"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4cd24ec0-18d"},"title":"Praktikum Controlling – Digitalisierung & KI aktiv mitgestalten","description":"<p>As a member of our Data &amp; AI Team, you will contribute to the development and implementation of modern reporting and AI solutions that support data-driven decision-making processes in Controlling. Your tasks will include:</p>\n<ul>\n<li>Developing and implementing innovative planning tools</li>\n<li>Business Intelligence with SAP Analytics Cloud: creating and enhancing dashboards and reports for management</li>\n<li>Data modelling and integration with SAP Datasphere to create powerful data structures for analysis and forecasting</li>\n<li>Implementing and testing small machine learning solutions to automate and improve Controlling processes</li>\n<li>Supporting the Data &amp; AI Team in developing the strategy, methods, and tools for digital transformation in Controlling</li>\n<li>Collaborating in interdisciplinary projects focused on digitalisation and data-based process optimisation</li>\n</ul>\n<p>You will work closely with our Data &amp; AI Team to design and implement cutting-edge solutions that drive business success. 
Your contributions will help shape the future of Controlling at Porsche.</p>\n<p>In this role, you will have the opportunity to work on various projects, including:</p>\n<ul>\n<li>Developing and implementing new reporting and analytics solutions</li>\n<li>Enhancing existing dashboards and reports</li>\n<li>Creating data models and integrating them with SAP Datasphere</li>\n<li>Implementing machine learning algorithms to automate and improve Controlling processes</li>\n<li>Collaborating with cross-functional teams to drive digital transformation</li>\n</ul>\n<p>As a member of our team, you will be part of a dynamic and innovative environment where you can grow professionally and personally. You will have access to state-of-the-art technology and tools, as well as opportunities for professional development and networking.</p>\n<p>If you are passionate about data-driven decision-making, innovation, and collaboration, we encourage you to apply for this exciting opportunity.</p>\n<p>Key responsibilities:</p>\n<ul>\n<li>Develop and implement modern reporting and AI solutions</li>\n<li>Create and enhance dashboards and reports for management</li>\n<li>Design and implement data models and integrate them with SAP Datasphere</li>\n<li>Implement and test small machine learning solutions</li>\n<li>Support the Data &amp; AI Team in developing the strategy, methods, and tools for digital transformation</li>\n<li>Collaborate in interdisciplinary projects focused on digitalisation and data-based process optimisation</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s degree in Economics, Computer Science, Mathematics, Data Science, Business Administration, or a related field</li>\n<li>First-hand experience in Business Intelligence, preferably with SAP Analytics Cloud and SAP Datasphere</li>\n<li>Basic knowledge of data modelling, data analysis, and machine learning</li>\n<li>Proficiency in MS Office and interest in modern database and cloud technologies</li>\n<li>Self-motivated, structured, and analytical working style</li>\n<li>Enjoy working in a team and being part of innovative digital solutions</li>\n<li>Flexibility and high motivation to learn and apply new technologies in Controlling</li>\n<li>Excellent German language skills and good English language skills</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Personal mentors and an open feedback culture</li>\n<li>Digital learning and working tools</li>\n<li>Mobile learning after coordination with the team</li>\n<li>Own project work</li>\n<li>Team events on a voluntary basis</li>\n<li>Active internship community</li>\n</ul>\n<p>Duration: 5-6 months</p>\n<p>Start date: September</p>\n<p>Location: Zuffenhausen</p>\n<p>We look forward to receiving your application!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_4cd24ec0-18d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Dr. Ing. h.c. F. 
Porsche AG","sameAs":"https://jobs.porsche.com","logo":"https://logos.yubhub.co/jobs.porsche.com.png"},"x-apply-url":"https://jobs.porsche.com/index.php?ac=jobad&id=19216","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SAP Analytics Cloud","SAP Datasphere","Machine Learning","Data Modelling","Business Intelligence","Python","R","SQL"],"x-skills-preferred":[],"datePosted":"2026-04-22T17:24:38.605Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Zuffenhausen"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Automotive","skills":"SAP Analytics Cloud, SAP Datasphere, Machine Learning, Data Modelling, Business Intelligence, Python, R, SQL"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_db261609-388"},"title":"Principal Data & Ontology Architect - AI Enablement","description":"<p>We are looking for a Principal Data &amp; Ontology Architect to support the implementation and adoption of data and ontology enablement practices and standards within Control Tower Operations to support scalable, governed, and business-aligned AI initiatives.</p>\n<p>The successful candidate will serve as the primary bridge between Business Units, Global IT, and Control Tower Operations, ensuring shared understanding of data practices, workflows, and requirements. They will apply established standards for semantic modeling, domain alignment, concept reuse, and ontology lifecycle management.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Supporting the implementation and ongoing maintenance of ontology enablement practices and operating model strategy to support AI, analytics, and digital initiatives across multiple Business Units</li>\n<li>Applying established standards for semantic modeling, domain alignment, concept reuse, and ontology lifecycle management</li>\n<li>Serving as the enterprise subject-matter authority for ontology-related topics, providing recommendations and guidance to governance and leadership forums</li>\n<li>Collaborating with Global IT and enterprise data architecture to ensure ontology practices align with enterprise data platforms and Control Tower operational processes</li>\n</ul>\n<ul>\n<li>Partnering with Business Units to understand domain concepts, terminology, operational data, and AI use cases, translating them into ontology-aligned data structures</li>\n<li>Guiding Business Units in contributing domain models, metadata, and data assets into the enterprise ontology using defined governance and intake processes</li>\n<li>Enabling repeatable onboarding of Business Unit data into AI initiatives, reducing reliance on ad-hoc IT engagement and minimizing duplicated effort</li>\n</ul>\n<ul>\n<li>Serving as a liaison between Business Units and Global IT for AI data and ontology-related matters</li>\n<li>Engaging with Global IT teams to understand enterprise data platforms, workflows, standards, and operational constraints</li>\n<li>Translating Global IT practices, requirements, and workflows into clear, actionable guidance for Business Unit data stewards</li>\n</ul>\n<ul>\n<li>Educating, guiding, and supporting Business Unit data stewards on their roles in data governance, ontology contribution, and AI data enablement</li>\n<li>Supporting the development and documentation of workflows, expectations, and operating models for how BU data stewards engage with the Control Tower 
and Global IT</li>\n</ul>\n<ul>\n<li>Ensuring Business Unit Data Stewards understand how to prepare, govern, and submit data assets for ontology integration and AI use</li>\n<li>Promoting consistent adoption of governance, quality, and semantic standards across Business Units</li>\n</ul>\n<ul>\n<li>Supporting integration of data and ontology enablement into Control Tower workflows</li>\n<li>Providing operational insight into data readiness, semantic risks, and governance gaps to inform Control Tower decision-making</li>\n<li>Identifying systemic issues and contributing recommendations to drive continuous improvement of data enablement processes</li>\n</ul>\n<ul>\n<li>Ensuring semantic integrity, data quality, lineage, and consistency are maintained as data assets flow into AI solutions</li>\n<li>Identifying systemic issues and recommending continuous improvement opportunities to Control Tower Operations leadership</li>\n<li>Influencing corrective actions, tooling investments, or governance updates to mitigate long-term risk</li>\n</ul>\n<p>This role requires a minimum of 10 years of relevant work experience in data architecture, data governance, ontology development, semantic modeling, or related disciplines, supporting cross-functional initiatives spanning multiple business units and IT organizations.</p>\n<p>The ideal candidate will have in-depth expertise in ontology design, semantic modeling, and domain-driven data architecture, as well as experience contributing to the development and implementation of data and ontology strategies. They will also have demonstrated experience serving as a bridge between business stakeholders and IT organizations, with a strong ability to translate technical platforms, workflows, and constraints into business-understandable guidance.</p>\n<p>A Bachelor&#39;s level degree or diploma in Computer Science, Data Science/Engineering, Applied Mathematics/Statistics, Electronics/Electrical, Information Technology/Information Sciences, or a related field of study is required. A Master&#39;s or Ph.D. degree is preferred.</p>\n<p>The successful candidate will be comfortable operating in ambiguous, evolving environments with enterprise-level impact, and will have a systems-thinking mindset with understanding of AI, analytics, and enterprise data platforms.</p>\n<p>Highly desirable skills include proficiency in OWL (Web Ontology Language), RDF/RDFS – graph-based data model, storage in graph databases such as Neo4j or Amazon Neptune, and querying using SPARQL for RDF-based ontologies.</p>\n<p>This is an onsite job based at our ADC, Raymond, OH office. 
One telecommuting workday per week may be possible with prior departmental approval.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_db261609-388","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Honda","sameAs":"https://careers.honda.com","logo":"https://logos.yubhub.co/careers.honda.com.png"},"x-apply-url":"https://careers.honda.com/us/en/job/10812/Principal-Data-Ontology-Architect-AI-Enablement","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$120,400.00 - $150,500.00","x-skills-required":["ontology design","semantic modeling","domain-driven data architecture","data governance","AI data enablement","data quality","lineage","consistency","OWL (Web Ontology Language)","RDF/RDFS – graph-based data model","graph databases","Neo4j","Amazon Neptune","SPARQL"],"x-skills-preferred":["ontology development","data architecture","data science","electronics","electrical","information technology","information sciences"],"datePosted":"2026-04-22T17:21:24.743Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Raymond"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Automotive","skills":"ontology design, semantic modeling, domain-driven data architecture, data governance, AI data enablement, data quality, lineage, consistency, OWL (Web Ontology Language), RDF/RDFS – graph-based data model, graph databases, Neo4j, Amazon Neptune, SPARQL, ontology development, data architecture, data science, electronics, electrical, information technology, information sciences","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120400,"maxValue":150500,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5f4a61c4-ad0"},"title":"Associate Manager, Amazon Insights and Analytics","description":"<p>At Bayer, we&#39;re seeking an Associate Manager, Amazon Insights and Analytics to join our Consumer Health team. As a key member of our marketing department, you will be responsible for transforming complex Amazon datasets into actionable sales strategies. Your primary goal will be to identify trends in customer behavior, help optimize promotional spend, and provide actionable insights to support revenue growth.</p>\n<p>Your tasks and responsibilities will include: Transforming weekly performance data into concise, executive-ready performance drivers and drags that explain why sales moved and what the immediate next step should be. Trend identification: highlighting emerging consumer search patterns before they become mainstream. Analyzing Amazon Brand Analytics and profitero and Circana data to understand consumer performance and market share performance. Conducting post-event deep dives for tentpole events (e.g., Prime Day, Black Friday) to measure ROI and inform future forecasting, promo depth, and participation. Monitoring competitor pricing moves and in-market performance to highlight any action that needs to be taken. Conversion optimization: monitoring the Buy Box and Glance Views to alert the Sales team of any traffic drops or conversion leaks that require immediate action. 
Consumer sentiment analysis: monitoring review counts and star ratings to provide Sales with feedback on product quality or Frequently Bought Together trends. Dashboard ownership: designing and maintaining automated sales reporting frameworks using Power BI, Tableau, Excel, and any new reporting tools. KPI management: defining and monitoring critical metrics, including Topline Sales, Glance Views, Conversion Rate (CR), Subscription, Profitability, and additional priority KPIs. Standardized reporting: delivering weekly, monthly, and quarterly business reviews to sales team and key internal stakeholders, highlighting Wins and Opportunities based on sales volume and market share. New item launches: creating the Launch Blueprint for new SKUs, using historical category data to set realistic sales targets and advertising benchmarks. Tracking competitor out-of-stock events, pricing, or promotional changes to provide Sales with live-time opportunities/challenges.</p>\n<p>To succeed in this role, you will need: A bachelor&#39;s degree in Business, Finance, Economics, Statistics, or a related analytical field. 2-4+ years of experience in e-commerce analytics, retail, or a CPG environment. Advanced Excel skills, including pivot tables, VLOOKUP/XLOOKUP, and complex data modeling. Proven experience building dashboards in Power BI or Tableau. A sales-first mindset, with the ability to see a data point and immediately translate it into a revenue-generating idea. Agility, with comfort working in a fast-paced environment where data is needed quickly. Experience with third-party Amazon tools, such as Pacvue, Helium10, CommerceIQ, Stackline, NielsenIQ, Profitero, or similar tools. Amazon fluency, with expertise in Vendor Central and familiarity with Amazon Marketing Cloud.</p>\n<p>As an Associate Manager, Amazon Insights and Analytics, you can expect to be paid a salary of approximately $115-173k, with additional compensation possible through a bonus or incentive program. 
Benefits include health care, vision, dental, retirement, PTO, sick leave, and more.</p>\n<p>If you&#39;re interested in joining our team and contributing to our mission of Health for all, Hunger for none, please apply now.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5f4a61c4-ad0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Bayer","sameAs":"https://talent.bayer.com","logo":"https://logos.yubhub.co/talent.bayer.com.png"},"x-apply-url":"https://talent.bayer.com/careers/job/562949976752151","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$115-173k","x-skills-required":["data analysis","Amazon insights","analytics","Power BI","Tableau","Excel","VLOOKUP/XLOOKUP","complex data modeling","sales strategy","trend identification","consumer behavior","promotional spend","ROI analysis","forecasting","advertising benchmarks"],"x-skills-preferred":["third-party Amazon tools","Pacvue","Helium10","CommerceIQ","Stackline","NielsenIQ","Profitero","Amazon fluency","Vendor Central","Amazon Marketing Cloud"],"datePosted":"2026-04-18T22:13:09.015Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Whippany"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Marketing","industry":"Healthcare","skills":"data analysis, Amazon insights, analytics, Power BI, Tableau, Excel, VLOOKUP/XLOOKUP, complex data modeling, sales strategy, trend identification, consumer behavior, promotional spend, ROI analysis, forecasting, advertising benchmarks, third-party Amazon tools, Pacvue, Helium10, CommerceIQ, Stackline, NielsenIQ, Profitero, Amazon fluency, Vendor Central, Amazon Marketing Cloud","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":115000,"maxValue":173000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7275ef33-009"},"title":"Staff Data Engineer","description":"<p>At Bayer, we&#39;re seeking a Staff Data Engineer to join our team. As a Staff Data Engineer, you will design and lead the implementation of data flows to connect operational systems, data for analytics and business intelligence (BI) systems. 
You will recognize opportunities to reuse existing data flows, lead the build of data streaming systems, optimize the code to ensure processes perform optimally, and lead work on database management.</p>\n<p>Communicating Between Technical and Non-Technical Colleagues</p>\n<p>As a Staff Data Engineer, you will communicate effectively with technical and non-technical stakeholders, support and host discussions within a multidisciplinary team, and be an advocate for the team externally.</p>\n<p>Data Analysis and Synthesis</p>\n<p>You will undertake data profiling and source system analysis, present clear insights to colleagues to support the end use of the data.</p>\n<p>Data Development Process</p>\n<p>You will design, build and test data products that are complex or large scale, build teams to complete data integration services.</p>\n<p>Data Innovation</p>\n<p>You will understand the impact on the organization of emerging trends in data tools, analysis techniques and data usage.</p>\n<p>Data Integration Design</p>\n<p>You will select and implement the appropriate technologies to deliver resilient, scalable and future-proofed data solutions and integration pipelines.</p>\n<p>Data Modeling</p>\n<p>You will produce relevant data models across multiple subject areas, explain which models to use for which purpose, understand industry-recognised data modelling patterns and standards, and when to apply them, compare and align different data models.</p>\n<p>Metadata Management</p>\n<p>You will design an appropriate metadata repository and present changes to existing metadata repositories, understand a range of tools for storing and working with metadata, provide oversight and advice to more inexperienced members of the team.</p>\n<p>Problem Resolution</p>\n<p>You will respond to problems in databases, data processes, data products and services as they occur, initiate actions, monitor services and identify trends to resolve problems, determine the appropriate remedy and assist with its implementation, and with preventative measures.</p>\n<p>Programming and Build</p>\n<p>You will use agreed standards and tools to design, code, test, correct and document moderate-to-complex programs and scripts from agreed specifications and subsequent iterations, collaborate with others to review specifications where appropriate.</p>\n<p>Technical Understanding</p>\n<p>You will understand the core technical concepts related to the role, and apply them with guidance.</p>\n<p>Testing</p>\n<p>You will review requirements and specifications, and define test conditions, identify issues and risks associated with work, analyse and report test activities and results.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7275ef33-009","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Bayer","sameAs":"https://talent.bayer.com","logo":"https://logos.yubhub.co/talent.bayer.com.png"},"x-apply-url":"https://talent.bayer.com/careers/job/562949976928777","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$114,400 to $171,600","x-skills-required":["Proficiency in programming language such as Python or Java","Experience with Big Data technologies such as Hadoop, Spark, and Kafka","Familiarity with ETL processes and tools","Knowledge of SQL and NoSQL databases","Strong understanding of relational databases","Experience with data warehousing 
solutions","Proficiency with cloud platforms","Expertise in data modeling and design","Experience in designing and building scalable data pipelines","Experience with RESTful APIs and data integration"],"x-skills-preferred":["Relevant certifications (e.g., GCP Certified, AWS Certified, Azure Certified)","Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field","Strong analytical and communication skills","Ability to work collaboratively in a team environment","High level of accuracy and attention to detail"],"datePosted":"2026-04-18T22:12:56.654Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"Proficiency in programming language such as Python or Java, Experience with Big Data technologies such as Hadoop, Spark, and Kafka, Familiarity with ETL processes and tools, Knowledge of SQL and NoSQL databases, Strong understanding of relational databases, Experience with data warehousing solutions, Proficiency with cloud platforms, Expertise in data modeling and design, Experience in designing and building scalable data pipelines, Experience with RESTful APIs and data integration, Relevant certifications (e.g., GCP Certified, AWS Certified, Azure Certified), Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field, Strong analytical and communication skills, Ability to work collaboratively in a team environment, High level of accuracy and attention to detail","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":114400,"maxValue":171600,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_90b5ac1d-d16"},"title":"Senior Software Engineer, Backend — Frontier Data","description":"<p>The Frontier Data team builds the data and systems that power Scale&#39;s most advanced Frontier AI use cases. We&#39;re looking for a Senior Backend Engineer who thrives in ambiguity, moves fast, and enjoys tackling daunting challenges.</p>\n<p>As a Senior Backend Engineer, you will own major backend systems for frontier agentic data products, driving projects from early exploration through production deployment. You will build scalable services and pipelines that support agent workflows, architect modular, reusable backend systems, and operate in high-ambiguity environments.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Designing and building scalable systems while partnering closely with research, product, operations, and other engineering teams</li>\n<li>Building scalable services and pipelines that support agent workflows</li>\n<li>Architecting modular, reusable backend systems that adapt to evolving product needs</li>\n<li>Operating in high-ambiguity environments and breaking down open-ended problems</li>\n<li>Partnering cross-functionally with product, research/ML, and infrastructure teams</li>\n</ul>\n<p>Ideal experience includes 5+ years of full-time software engineering experience, strong backend engineering fundamentals, and experience building systems that scale.</p>\n<p>Compensation packages at Scale include base salary, equity, and benefits. 
The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors.</p>\n<p>Additional benefits include comprehensive health, dental, and vision coverage, retirement benefits, a learning and development stipend, and generous PTO.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_90b5ac1d-d16","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Frontier Data","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4648525005","x-work-arrangement":null,"x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$216,000-$270,000 USD","x-skills-required":["Distributed systems","API design","Data modeling","Production reliability","Docker","Containerized development/production environments","SQL","Modern database-backed application development"],"x-skills-preferred":["Async processing","Workflow engines","Data pipelines"],"datePosted":"2026-04-18T16:01:34.567Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA; New York, NY"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Distributed systems, API design, Data modeling, Production reliability, Docker, Containerized development/production environments, SQL, Modern database-backed application development, Async processing, Workflow engines, Data pipelines","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":216000,"maxValue":270000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3fa0b80f-842"},"title":"Staff Software Engineer, Public Sector","description":"<p>Job Title: Staff Software Engineer, Public Sector</p>\n<p>We are seeking a highly skilled Staff Software Engineer to join our Public Sector team. As a Staff Software Engineer, you will be responsible for designing and implementing software solutions for the public sector. 
You will work closely with cross-functional teams to develop and deploy software applications that meet the needs of government agencies.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design and implement software solutions for the public sector</li>\n<li>Work closely with cross-functional teams to develop and deploy software applications</li>\n<li>Collaborate with stakeholders to understand their needs and develop software solutions that meet those needs</li>\n<li>Develop and maintain software documentation</li>\n<li>Participate in code reviews and ensure that code meets quality standards</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s degree in Computer Science or related field</li>\n<li>5+ years of experience in software development</li>\n<li>Proficiency in programming languages such as Java, Python, or C++</li>\n<li>Experience with Agile development methodologies</li>\n<li>Strong understanding of software design patterns and principles</li>\n<li>Excellent communication and collaboration skills</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Master&#39;s degree in Computer Science or related field</li>\n<li>10+ years of experience in software development</li>\n<li>Experience with cloud-based technologies such as AWS or Azure</li>\n<li>Experience with DevOps practices</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunities for professional growth and development</li>\n<li>Collaborative and dynamic work environment</li>\n</ul>\n<p>Salary Range: $252,000-$362,000 USD</p>\n<p>Required Skills:</p>\n<ul>\n<li>Full Stack Development</li>\n<li>Cloud-Native Technologies</li>\n<li>Data Engineering</li>\n<li>AI Application Integration</li>\n<li>Problem Solving</li>\n<li>Collaboration and Communication</li>\n<li>Adaptability and Learning Agility</li>\n</ul>\n<p>Preferred Skills:</p>\n<ul>\n<li>Experience with modern web development frameworks</li>\n<li>Familiarity with cloud platforms</li>\n<li>Understanding of containerization and container orchestration</li>\n<li>Knowledge of ETL processes</li>\n<li>Understanding of data modeling, data warehousing, and data governance principles</li>\n<li>Familiarity with integrating Large Language Models</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3fa0b80f-842","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale","sameAs":"https://www.scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4674913005","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$252,000-$362,000 USD","x-skills-required":["Full Stack Development","Cloud-Native Technologies","Data Engineering","AI Application Integration","Problem Solving","Collaboration and Communication","Adaptability and Learning Agility"],"x-skills-preferred":["Experience with modern web development frameworks","Familiarity with cloud platforms","Understanding of containerization and container orchestration","Knowledge of ETL processes","Understanding of data modeling, data warehousing, and data governance principles","Familiarity with integrating Large Language Models"],"datePosted":"2026-04-18T16:00:27.694Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA; St. 
Louis, MO; New York, NY; Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Full Stack Development, Cloud-Native Technologies, Data Engineering, AI Application Integration, Problem Solving, Collaboration and Communication, Adaptability and Learning Agility, Experience with modern web development frameworks, Familiarity with cloud platforms, Understanding of containerization and container orchestration, Knowledge of ETL processes, Understanding of data modeling, data warehousing, and data governance principles, Familiarity with integrating Large Language Models","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":252000,"maxValue":362000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5465ae79-24e"},"title":"Analytics & Systems Lead, Finance","description":"<p>We&#39;re seeking an experienced Analytics &amp; Systems Lead to join our Finance team. In this role, you will work closely with stakeholders across business finance, corporate finance, and accounting to design and develop internal tools and AI agents that automate workflows across finance and accounting.</p>\n<p>You will design data models, prototype internal tools, and implement agent-driven workflows using our internal data infrastructure, system integration tooling, Scale&#39;s proprietary AI platform, and emerging AI tools.</p>\n<p>Key responsibilities include designing and developing end-to-end agent-driven workflows, building scalable data models and pipelines, partnering with stakeholders to translate business requirements into technical requirements, and collaborating with engineering teams to develop internal tools.</p>\n<p>Ideal candidates will have 5+ years of experience in data analytics, analytics engineering, or data science roles, expert knowledge of SQL and Python, and experience building internal tools, automation systems, or data products.</p>\n<p>Compensation packages at Scale include base salary, equity, and benefits, with a base salary range of $200,000-$250,000 USD for this full-time position in San Francisco.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5465ae79-24e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4673090005","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$200,000-$250,000 USD","x-skills-required":["SQL","Python","data analysis","data modeling","internal tools","automation systems","data products"],"x-skills-preferred":["JavaScript","modern automation tooling"],"datePosted":"2026-04-18T15:59:26.227Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, data analysis, data modeling, internal tools, automation systems, data products, JavaScript, modern automation 
tooling","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":200000,"maxValue":250000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_10cff9d9-bba"},"title":"Senior Compensation Partner","description":"<p>About Gusto</p>\n<p>At Gusto, we&#39;re on a mission to grow the small business economy. We handle the hard stuff , payroll, health insurance, 401(k)s, and HR , so owners can focus on their craft and their customers.</p>\n<p>With teams in Denver, San Francisco, and New York, we support more than 400,000 small businesses nationwide and are building a workplace that reflects the people we serve.</p>\n<p>All full-time employees receive competitive base pay, benefits, and equity (RSUs) , because everyone who helps build Gusto should share in its success. Offer amounts are determined by role, level, and location. Learn more about our Total Rewards philosophy.</p>\n<p>AI is a fundamental part of how work gets done at Gusto. We expect all team members to actively engage with AI tools relevant to their role and grow their fluency as the technology evolves. AI experience requirements vary by role and will be assessed during the interview process.</p>\n<p>About the Role: The Senior Compensation Partner is a core member of Gusto&#39;s People team, responsible for building and scaling the compensation programs that help us attract, retain, and reward top talent. This role exists to bring rigor, fairness, and strategic clarity to how Gusto thinks about pay , operating at the intersection of data, policy, and people.</p>\n<p>Operating with a systems-first, AI-aware mindset, the Senior Compensation Partner uses automation, scalable workflows, and well-designed tooling to improve decision quality, ensure compliance, and reduce reliance on manual, one-off analysis. This role partners closely with HR Business Partners, Finance, Legal, and business leaders to ensure our compensation programs are grounded in market reality and aligned to our values.</p>\n<p>Success in this role is measured by the quality and fairness of compensation decisions, the scalability and reliability of our programs, and the degree to which the organization can operate with confidence , without constant intervention or heroics.</p>\n<p>Gusto&#39;s compensation function, built on first-principles thinking, is at an inflection point , scaling from early foundations to a mature, high-trust operation. 
You&#39;ll have real ownership to shape how we design and deliver compensation as the company grows.</p>\n<p>Here’s what you’ll do day-to-day:</p>\n<ul>\n<li>Develop, implement, and communicate Gusto&#39;s compensation programs, policies, and procedures, ensuring internal equity, external competitiveness, and alignment with business strategy.</li>\n</ul>\n<ul>\n<li>Partner with and influence key stakeholders , including HRBPs, Finance, and senior leadership , to translate compensation philosophy into clear, actionable programs.</li>\n</ul>\n<ul>\n<li>Lead annual compensation cycles (merit, equity refresh, benchmarking) and own the end-to-end process from data collection through communication.</li>\n</ul>\n<ul>\n<li>Perform market analyses and compensation studies using survey data (e.g., Radford/Aon, Mercer, Pave) to recommend pay ranges, job levels, and offer positioning.</li>\n</ul>\n<ul>\n<li>Use AI tools and automation thoughtfully to accelerate analysis, synthesize benchmarking data, and reduce planning and reporting load , while maintaining sound judgment around data quality and decision guardrails.</li>\n</ul>\n<ul>\n<li>Design and maintain scalable compensation workflows and tooling , automating recurring analyses, audit processes, and reporting so programs run reliably without manual intervention.</li>\n</ul>\n<ul>\n<li>Build and maintain compensation dashboards and data visualizations that give leaders real-time insight into pay equity, market positioning, and budget utilization.</li>\n</ul>\n<ul>\n<li>Craft competitive, compelling offers to attract top-tier talent, and build playbooks to help recruiters and HRBPs communicate Gusto&#39;s total rewards story.</li>\n</ul>\n<ul>\n<li>Ensure ongoing compliance with federal, state, and local pay regulations , including pay transparency laws, FLSA classification standards, and emerging pay equity requirements , and proactively flag legislative changes.</li>\n</ul>\n<ul>\n<li>Partner with People Operations to integrate and optimize HRIS and compensation tools, ensuring data accuracy and system scalability as the company grows.</li>\n</ul>\n<p>Here’s what we&#39;re looking for:</p>\n<ul>\n<li>8+ years of experience in Compensation, Total Rewards, People Operations, or a related analytical function , ideally in a high-growth technology environment.</li>\n</ul>\n<ul>\n<li>AI fluency: hands-on experience using AI tools or automation to accelerate analysis or synthesis, paired with strong judgment around trust and validation.</li>\n</ul>\n<ul>\n<li>Hands-on experience running compensation cycles (merit, equity, benchmarking) and building scalable programs from early-stage to maturity.</li>\n</ul>\n<ul>\n<li>Survey benchmarking fluency: direct experience with Radford/Aon, Mercer, or comparable compensation survey tools and methodologies.</li>\n</ul>\n<ul>\n<li>Systems-first orientation: experience building durable workflows, automating reports, or designing operating mechanisms that reduce recurring manual work.</li>\n</ul>\n<ul>\n<li>Strong analytical foundation: advanced spreadsheet skills (Excel, Google Sheets), comfort with data modeling, and experience building dashboards or visualizations that leaders actually use.</li>\n</ul>\n<ul>\n<li>Knowledge of compensation and employment law, including FLSA, pay transparency statutes, and pay equity frameworks , or foundational knowledge with a clear drive to build expertise.</li>\n</ul>\n<ul>\n<li>Clear, concise communicator who can translate complex pay data into narratives and recommendations 
for non-specialist audiences, including executives.</li>\n</ul>\n<ul>\n<li>Comfort operating at multiple altitudes , from hands-on analysis and modeling to advising leadership on compensation philosophy and strategy.</li>\n</ul>\n<ul>\n<li>High integrity and intellectual curiosity; someone who asks hard questions about fairness and market position, and isn&#39;t satisfied with the status quo.</li>\n</ul>\n<ul>\n<li>Background in Mathematics, Finance, Statistics, Economics, or a related quantitative field is preferred but not required.</li>\n</ul>\n<p>Gusto offers competitive cash compensation, equity, and a comprehensive benefits package. Our cash compensation range for this role is $124,000 to $150,000 in Denver, and $152,000 to $185,000 in San Francisco and New York. Final offer amounts are determined by multiple factors, including candidate location, experience, and expertise. Gusto&#39;s Total Rewards philosophy is rooted in transparency, fairness, and the belief that people do their best work when they feel secure and valued. As a member of the compensation team, you&#39;ll have unique visibility into , and influence over , the programs that define that experience for every Gustie.</p>\n<p>Gusto has physical office spaces in Denver, San Francisco, and New York City. Employees who are based in those locations will be expected to work from the office on designated days approximately 2-3 days per week (or more depending on role). The same office expectations apply to all Symmetry roles, Gusto&#39;s subsidiary, whose physical office is in Scottsdale. Note: The San Francisco office expectations encompass both the San Francisco and San Jose metro areas. When approved to work from a location other than a Gusto office, a secure, reliable, and consistent internet connection is required. This includes non-office days for hybrid employees.</p>\n<p>Our customers come from all walks of life and so do we. We hire great people from a wide variety of backgrounds, not just because it&#39;s the right thing to do, but because it makes our company stronger. If you share our values and our enthusiasm for small businesses, you will find a home at Gusto. Gusto is proud to be an equal opportunity employer. 
We do not discriminate in hiring or any employment decision based on race, color, religion, national origin, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or other applicable legally protected characteristic.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_10cff9d9-bba","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Gusto","sameAs":"https://www.gusto.com/","logo":"https://logos.yubhub.co/gusto.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/gusto/jobs/7677492","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$124,000 to $150,000 in Denver, and $152,000 to $185,000 in San Francisco and New York","x-skills-required":["Compensation","Total Rewards","People Operations","Analytical function","AI tools","Automation","Survey benchmarking","Compensation survey tools","Data modeling","Spreadsheet skills","Data visualization","Pay equity frameworks"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:58:27.770Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Denver, CO;San Francisco, CA;New York, NY"}},"employmentType":"FULL_TIME","occupationalCategory":"HR","industry":"Technology","skills":"Compensation, Total Rewards, People Operations, Analytical function, AI tools, Automation, Survey benchmarking, Compensation survey tools, Data modeling, Spreadsheet skills, Data visualization, Pay equity frameworks","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":124000,"maxValue":185000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7f1a5b85-116"},"title":"Mission Software Engineer, Public Sector","description":"<p>We are seeking a highly skilled and motivated Mission Software Engineer to join our dynamic Federal Engineering team. 
As a part of this team, you will play a critical role in supporting Scale&#39;s government customers by scoping and developing onsite solutions.</p>\n<p>Our scalable, high-performance platform is the foundation for these customer solutions, and your expertise will be instrumental in designing and implementing systems that can handle interactions with existing customer systems to help our products integrate into existing customer workflows.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Work directly with customers to understand their problems and translate those into features in Scale&#39;s platform.</li>\n<li>Be open to &gt;50% travel or relocation to a key customer geographic location.</li>\n<li>Collaborate with cross-functional teams to define and execute the vision for backend solutions, ensuring they meet the unique needs of government agencies operating in secure environments.</li>\n<li>Implement end-to-end data integrations, syncing customers&#39; data to Scale&#39;s platform and back.</li>\n<li>Deploy and maintain Scale software at customer sites.</li>\n<li>Develop customer-requested features and work closely with customers to ensure those features win customer love.</li>\n<li>Build robust and reliable backend systems that can serve as standalone products, empowering customers to accelerate their own AI ambitions.</li>\n<li>Participate actively in customer engagements, working closely with stakeholders to understand requirements and deliver innovative solutions.</li>\n</ul>\n<p>Ideal Candidate:</p>\n<ul>\n<li>Track record of success as a hybrid customer-facing and forward-deployed software engineer, with the ability to quickly adapt to different roles.</li>\n<li>Prior experience developing with Python and JavaScript, or other modern software languages. Familiarity with Node and React is a plus.</li>\n<li>Cloud-Native Technologies: Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and experience in developing and deploying applications in a cloud-native environment. Understanding of containerization (e.g., Docker) and container orchestration (e.g., Kubernetes) is a plus.</li>\n<li>Linux experience: Understanding of shell scripting, operating systems, etc.</li>\n<li>Networking experience: Understanding of networking technologies, configuration (ports, protocols, etc.)</li>\n<li>Data Engineering: Knowledge of ETL (Extract, Transform, Load) processes and experience in building data pipelines to integrate and process diverse data sources. Understanding of data modeling, data warehousing, and data governance principles.</li>\n<li>Problem Solving: Strong analytical and problem-solving skills to understand complex challenges and devise effective solutions. Ability to think critically, identify root causes, and propose innovative approaches to overcome technical obstacles.</li>\n</ul>\n<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training. 
Scale employees in eligible roles are also granted equity based compensation, subject to Board of Director approval.</p>\n<p>Benefits:</p>\n<ul>\n<li>Comprehensive health, dental and vision coverage,</li>\n<li>Retirement benefits,</li>\n<li>A learning and development stipend,</li>\n<li>Generous PTO,</li>\n<li>Commuter stipend</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7f1a5b85-116","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4481921005","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$138,000-$292,560 USD","x-skills-required":["Python","JavaScript","Node","React","Cloud-Native Technologies","Linux","Networking","Data Engineering","ETL","Data Modeling","Data Warehousing","Data Governance"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:58:00.256Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Boston, Massachusetts ; Honolulu, HI; San Diego, CA; San Francisco, CA; St. Louis, MO; New York, NY; Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, JavaScript, Node, React, Cloud-Native Technologies, Linux, Networking, Data Engineering, ETL, Data Modeling, Data Warehousing, Data Governance","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":138000,"maxValue":292560,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d5f768d1-df6"},"title":"Full-Stack Engineer, AI Data Platform","description":"<p>Shape the Future of AI</p>\n<p>At Labelbox, we&#39;re building the critical infrastructure that powers breakthrough AI models at leading research labs and enterprises. Since 2018, we&#39;ve been pioneering data-centric approaches that are fundamental to AI development, and our work becomes even more essential as AI capabilities expand exponentially.</p>\n<p>We&#39;re the only company offering three integrated solutions for frontier AI development:</p>\n<ul>\n<li>Enterprise Platform &amp; Tools: Advanced annotation tools, workflow automation, and quality control systems that enable teams to produce high-quality training data at scale</li>\n</ul>\n<ul>\n<li>Frontier Data Labeling Service: Specialized data labeling through Alignerr, leveraging subject matter experts for next-generation AI models</li>\n</ul>\n<ul>\n<li>Expert Marketplace: Connecting AI teams with highly skilled annotators and domain experts for flexible scaling</li>\n</ul>\n<p>Why Join Us</p>\n<ul>\n<li>High-Impact Environment: We operate like an early-stage startup, focusing on impact over process. You&#39;ll take on expanded responsibilities quickly, with career growth directly tied to your contributions.</li>\n</ul>\n<ul>\n<li>Technical Excellence: Work at the cutting edge of AI development, collaborating with industry leaders and shaping the future of artificial intelligence.</li>\n</ul>\n<ul>\n<li>Innovation at Speed: We celebrate those who take ownership, move fast, and deliver impact. 
Our environment rewards high agency and rapid execution.</li>\n</ul>\n<ul>\n<li>Continuous Growth: Every role requires continuous learning and evolution. You&#39;ll be surrounded by curious minds solving complex problems at the frontier of AI.</li>\n</ul>\n<ul>\n<li>Clear Ownership: You&#39;ll know exactly what you&#39;re responsible for and have the autonomy to execute. We empower people to drive results through clear ownership and metrics.</li>\n</ul>\n<p>Role Overview</p>\n<p>We’re looking for a Full-Stack AI Engineer to join our team, where you’ll build the next generation of tools for developing, evaluating, and training state-of-the-art AI systems. You will own features end to end, from user-facing experiences and APIs to backend services, data models, and infrastructure.</p>\n<p>You’ll be at the heart of our applied AI efforts, with a particular focus on human-in-the-loop systems used to generate high-quality training data for Large Language Models (LLMs) and AI agents. This includes building a platform that enables us and our customers to create and evaluate data, as well as systems that leverage LLMs to assist with reviewing, scoring, and improving human submissions.</p>\n<p>Your Impact</p>\n<ul>\n<li>Own End-to-End Product Features</li>\n</ul>\n<p>Design, build, and ship complete workflows spanning frontend UI, APIs, backend services, databases, and production infrastructure.</p>\n<ul>\n<li>Enable Human-in-the-Loop AI Training</li>\n</ul>\n<p>Build systems that allow humans to efficiently create, review, and curate high-quality training and evaluation data used in AI model development.</p>\n<ul>\n<li>Support RLHF and Preference Data Workflows</li>\n</ul>\n<p>Design and implement tooling that supports RLHF-style pipelines, including task generation, human review, scoring, aggregation, and dataset versioning.</p>\n<ul>\n<li>Leverage LLMs in the Review Loop</li>\n</ul>\n<p>Build systems that use LLMs to assist human reviewers, such as automated checks, critiques, ranking suggestions, or quality signals, while maintaining human oversight.</p>\n<ul>\n<li>Advance AI Evaluation</li>\n</ul>\n<p>Design and implement evaluation frameworks and interactive tools for LLMs and AI agents across multiple data modalities (text, images, audio, video).</p>\n<ul>\n<li>Create Intuitive, Reviewer-Focused Interfaces</li>\n</ul>\n<p>Build thoughtful, efficient user interfaces (e.g., in React) optimized for high-throughput human review, quality control, and operational workflows.</p>\n<ul>\n<li>Architect Scalable Data &amp; Service Layers</li>\n</ul>\n<p>Design APIs, backend services, and data schemas that support large-scale data creation, review, and iteration with strong guarantees around correctness and traceability.</p>\n<ul>\n<li>Solve Ambiguous, Real-World Problems</li>\n</ul>\n<p>Translate loosely defined operational and research needs into practical, scalable, end-to-end systems.</p>\n<ul>\n<li>Ensure System Reliability</li>\n</ul>\n<p>Participate in on-call rotations to monitor, troubleshoot, and resolve issues across the full stack.</p>\n<ul>\n<li>Elevate the Team</li>\n</ul>\n<p>Improve engineering practices, development processes, and documentation. 
Share knowledge through technical writing and design discussions.</p>\n<p>What You Bring</p>\n<ul>\n<li>Bachelor’s degree in Computer Science, Data Engineering, or a related field.</li>\n</ul>\n<ul>\n<li>2+ years of experience in a software or machine learning engineering role.</li>\n</ul>\n<ul>\n<li>A proactive, product-focused mindset and a high degree of ownership, with a passion for building solutions that empower users.</li>\n</ul>\n<ul>\n<li>Experience using frontend frameworks like React/Redux and backend systems and technologies like Python, Java, GraphQL; familiarity with NodeJS and NestJS is a plus.</li>\n</ul>\n<ul>\n<li>Knowledge of designing and managing scalable database systems, including relational databases (e.g., PostgreSQL, MySQL), NoSQL stores (e.g., MongoDB, Cassandra), and cloud-native solutions (e.g., Google Spanner, AWS DynamoDB).</li>\n</ul>\n<ul>\n<li>Familiarity with cloud infrastructure like GCP (GCS, PubSub) and containerization (Kubernetes) is a plus.</li>\n</ul>\n<ul>\n<li>Excellent communication and collaboration skills.</li>\n</ul>\n<ul>\n<li>High proficiency in leveraging AI tools for daily development (e.g., Cursor, GitHub Copilot).</li>\n</ul>\n<ul>\n<li>Comfort and enthusiasm for working in a fast-paced, agile environment where rapid problem-solving is key.</li>\n</ul>\n<p>Bonus Points</p>\n<ul>\n<li>Experience building tools for AI/ML applications, particularly for data annotation, monitoring, or agent evaluation.</li>\n</ul>\n<ul>\n<li>Familiarity with data infrastructure components such as data pipelines, streaming systems, and storage architectures (e.g., Cloud Buckets, Key-Value Stores).</li>\n</ul>\n<ul>\n<li>Previous experience with search engines (e.g., ElasticSearch).</li>\n</ul>\n<ul>\n<li>Experience in optimizing databases for performance (e.g., schema design, indexing, query tuning) and integrating them with broader data workflows.</li>\n</ul>\n<p>Engineering at Labelbox</p>\n<p>At Labelbox Engineering, we&#39;re building a comprehensive platform that powers the future of AI development. Our team combines deep technical expertise with a passion for innovation, working at the intersection of AI infrastructure, data systems, and user experience. We believe in pushing technical boundaries while maintaining high standards of code quality and system reliability. Our engineering culture emphasizes autonomous decision-making, rapid iteration, and collaborative problem-solving. We&#39;ve cultivated an environment where engineers can take ownership of significant challenges, experiment with cutting-edge technologies, and see their solutions directly impact how leading AI labs and enterprises build the next generation of AI systems.</p>\n<p>Our Technology Stack</p>\n<p>Our engineering team works with a modern tech stack designed for scalability, performance, and developer efficiency:</p>\n<ul>\n<li>Frontend: React.js with Redux, TypeScript</li>\n</ul>\n<ul>\n<li>Backend: Node.js, TypeScript, Python, some Java &amp; Kotlin</li>\n</ul>\n<ul>\n<li>APIs: GraphQL</li>\n</ul>\n<ul>\n<li>Cloud &amp; Infrastructure: Google Cloud Platform (GCP), Kubernetes</li>\n</ul>\n<ul>\n<li>Databases: MySQL, Spanner, PostgreSQL</li>\n</ul>\n<ul>\n<li>Queueing / Streaming: Kafka, PubSub</li>\n</ul>\n<p>Labelbox strives to ensure pay parity across the organization and discuss compensation transparently. The expected annual base salary range for United States-based candidates is below. This range is not inclusive of any potential equity packages or additional benefits. 
Exact compensation varies based on a variety of factors, including skills and competencies, experience, and geographical location.</p>\n<p>Annual base salary range $130,000-$200,000 USD</p>\n<p>Life at Labelbox</p>\n<ul>\n<li>Location: Join our dedicated tech hubs in San Francisco or Wrocław, Poland</li>\n</ul>\n<ul>\n<li>Work Style: Hybrid model with 2 days per week in office, combining collaboration and flexibility</li>\n</ul>\n<ul>\n<li>Environment: Fast-paced and high-intensity, perfect for ambitious individuals who thrive on ownership and quick decision-making</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d5f768d1-df6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Labelbox","sameAs":"https://www.labelbox.com/","logo":"https://logos.yubhub.co/labelbox.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/labelbox/jobs/5019254007","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$130,000-$200,000 USD","x-skills-required":["React","Redux","Node.js","TypeScript","Python","Java","GraphQL","MySQL","PostgreSQL","Spanner","Kafka","PubSub","GCP","Kubernetes","Cloud computing","Containerization","Database management","Cloud infrastructure","API design","Backend services","Data models","Infrastructure"],"x-skills-preferred":["AI tools","Cursor","GitHub Copilot","Data annotation","Monitoring","Agent evaluation","Data infrastructure","Data pipelines","Streaming systems","Storage architectures","Search engines","ElasticSearch","Database optimization","Schema design","Indexing","Query tuning"],"datePosted":"2026-04-18T15:57:55.464Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco Bay Area"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"React, Redux, Node.js, TypeScript, Python, Java, GraphQL, MySQL, PostgreSQL, Spanner, Kafka, PubSub, GCP, Kubernetes, Cloud computing, Containerization, Database management, Cloud infrastructure, API design, Backend services, Data models, Infrastructure, AI tools, Cursor, GitHub Copilot, Data annotation, Monitoring, Agent evaluation, Data infrastructure, Data pipelines, Streaming systems, Storage architectures, Search engines, ElasticSearch, Database optimization, Schema design, Indexing, Query tuning","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":130000,"maxValue":200000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7e22bd51-5ef"},"title":"Senior Product Manager, CRM","description":"<p>We are seeking a Senior Product Manager, CRM Backend to drive the strategy and execution of our core CRM platform and data model. As a member of the Client Engagement team, you will work closely with other Product Managers, Data Engineering, and various teams across the organization that rely on customer data. 
This is an opportunity for you to craft the fundamental data and platform strategy for our CRM offering, ensuring its scalability and long-term viability as a critical company asset.</p>\n<p>In this role, you will define and own the CRM data model strategy, guide and execute scalable platform decisions for the CRM backend, oversee and define the strategy for third-party integrations, and leverage the latest technical advancements in data management. You will also lead and align a dedicated team of engineers, prioritize their work, and manage the technical roadmap for the CRM backend platform.</p>\n<p>We are looking for a candidate with 4+ years of experience in product management, specifically focusing on backend systems, data models, or platform products. You should have a strong technical background and experience in data management, data modeling, and data strategy. Additionally, you should be well-versed in managing technical requirements for third-party integrations and data partnerships.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7e22bd51-5ef","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Squarespace","sameAs":"https://www.squarespace.com/about/careers","logo":"https://logos.yubhub.co/squarespace.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/squarespace/jobs/7591635","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["product management","backend systems","data models","platform products","data management","data modeling","data strategy","third-party integrations","data partnerships"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:57:50.783Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dublin"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"product management, backend systems, data models, platform products, data management, data modeling, data strategy, third-party integrations, data partnerships"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_bfddfcc3-e38"},"title":"Senior Software Engineer, Public Sector","description":"<p>As a Senior Software Engineer, you will lead the development of a vertical feature or a horizontal capability to include defining requirements with stakeholders and implementation until it is accepted by the stakeholders.</p>\n<p>You will:</p>\n<p>Lead the design and implementation of scalable backend systems and distributed architectures for Federal customers. Manage the full lifecycle of feature development from requirement definition to deployment on classified networks. Direct the orchestration of asynchronous agent fleets to meet mission requirements. Lead customer engagements to translate mission needs into technical requirements. Own the communication with stakeholders to ensure implementation meets defined acceptance criteria. Conduct technical reviews and identify risks within machine learning infrastructure and model serving. 
Drive the platform roadmap by providing technical specifications for Federal product offerings.</p>\n<p>Ideally you will have:</p>\n<ul>\n<li>Full Stack Development: Proficiency in front-end, back-end development and infrastructure, including experience with modern web development frameworks, programming languages, and databases.</li>\n<li>Cloud-Native Technologies: Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and experience in developing and deploying applications in a cloud-native environment. Understanding of containerization (e.g., Docker) and container orchestration (e.g., Kubernetes) is a plus.</li>\n<li>Data Engineering: Knowledge of ETL (Extract, Transform, Load) processes and experience in building data pipelines to integrate and process diverse data sources. Understanding of data modeling, data warehousing, and data governance principles.</li>\n<li>AI Application Integration: Familiarity with integrating Large Language Models (LLMs) and building agentic workflows. Understanding of prompt engineering, retrieval-augmented generation (RAG), and agent orchestration is beneficial.</li>\n<li>Problem Solving: Strong analytical and problem-solving skills to understand complex challenges and devise effective solutions. Ability to think critically, identify root causes, and propose innovative approaches to overcome technical obstacles.</li>\n<li>Collaboration and Communication: Excellent interpersonal and communication skills to effectively collaborate with cross-functional teams, stakeholders, and customers. Ability to clearly articulate technical concepts to non-technical audiences and foster a collaborative work environment.</li>\n<li>Adaptability and Learning Agility: Willingness to embrace new technologies, learn new skills, and adapt to defining and evolving project requirements. Ability to quickly grasp and apply new concepts and stay up-to-date with emerging trends in software engineering.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_bfddfcc3-e38","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale","sameAs":"https://www.scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4674911005","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$216,000-$311,000 USD (San Francisco, New York, Seattle) $194,400-$279,000 USD (Hawaii, Washington DC, Texas, Colorado) $162,400-$233,000 USD (St. Louis)","x-skills-required":["Full Stack Development","Cloud-Native Technologies","Data Engineering","AI Application Integration","Problem Solving","Collaboration and Communication","Adaptability and Learning Agility"],"x-skills-preferred":["Docker","Kubernetes","AWS","Azure","GCP","ETL","data modeling","data warehousing","data governance","Large Language Models","prompt engineering","retrieval-augmented generation","agent orchestration"],"datePosted":"2026-04-18T15:57:07.621Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA; St. 
Louis, MO; New York, NY; Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Full Stack Development, Cloud-Native Technologies, Data Engineering, AI Application Integration, Problem Solving, Collaboration and Communication, Adaptability and Learning Agility, Docker, Kubernetes, AWS, Azure, GCP, ETL, data modeling, data warehousing, data governance, Large Language Models, prompt engineering, retrieval-augmented generation, agent orchestration","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":162400,"maxValue":311000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_53ee0ef3-c62"},"title":"Staff Data Engineer, Analytics Data Engineering","description":"<p>We are looking for a Staff Data Engineer to join our Analytics Data Engineering (ADE) team within Data Science &amp; AI Platform. As a Staff Data Engineer, you will be responsible for solving cross-cutting data challenges that span multiple lines of business while driving standardization in how we build, deploy, and govern analytics pipelines across Dropbox.</p>\n<p>This is not a maintenance role. We are modernizing our analytics platform, upgrading orchestration infrastructure, building shared and reusable data models with conformed dimensions, establishing a certified metrics framework, and laying the foundation for AI-native data development. You will partner closely with Data Science, Data Infrastructure, Product Engineering, and Business Intelligence teams to make this happen.</p>\n<p>You will play a crucial role in establishing analytics engineering standards, designing scalable data models, and driving cross-functional alignment on data governance. 
You will get substantial exposure to senior leadership, shape the technical direction of analytics infrastructure at Dropbox, and directly influence how data powers product and business decisions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Lead the design and implementation of shared, reusable data models, defining shared fact tables, conformed dimensions, and a semantic/metrics layer that serves as the single source of truth across analytics functions</li>\n</ul>\n<ul>\n<li>Drive standardization of data engineering practices across ADE and functional analytics teams, including pipeline patterns, CI/CD workflows, naming conventions, and data modeling standards</li>\n</ul>\n<ul>\n<li>Partner with Data Infrastructure to modernize orchestration, improve pipeline decomposition, and establish secure dev/test environments with production data access</li>\n</ul>\n<ul>\n<li>Architect and implement a shift-left data governance strategy, working with upstream data producers to establish data contracts, SLOs, and code-enforced quality gates that catch issues before production</li>\n</ul>\n<ul>\n<li>Collaborate with Data Science leads and Product Management to translate metric definitions into reliable, certified data pipelines that power executive dashboards, WBR reporting, and growth measurement</li>\n</ul>\n<ul>\n<li>Reduce operational burden by improving pipeline granularity, observability, and failure recovery, establishing runbooks and alerting standards that make on-call sustainable</li>\n</ul>\n<ul>\n<li>Evaluate and integrate AI-native tooling into the data development lifecycle, enabling conversational data exploration with guardrails and AI-assisted pipeline development</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>BS degree in Computer Science or related technical field, or equivalent technical experience</li>\n</ul>\n<ul>\n<li>12+ years of experience in data engineering or analytics engineering with increasing scope and technical leadership</li>\n</ul>\n<ul>\n<li>12+ years of SQL experience, including complex analytical queries, window functions, and performance optimization at scale (Spark SQL)</li>\n</ul>\n<ul>\n<li>8+ years of Python development experience, including building and maintaining production data pipelines</li>\n</ul>\n<ul>\n<li>Deep expertise in dimensional data modeling, schema design, and scalable data architecture, with hands-on experience building shared data models across multiple business domains</li>\n</ul>\n<ul>\n<li>Strong experience with orchestration tools (Airflow strongly preferred) and dbt, including pipeline design, scheduling strategies, and failure recovery patterns</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Experience with Databricks (Unity Catalog, Delta Lake) and modern lakehouse architectures</li>\n</ul>\n<ul>\n<li>Experience leading orchestration or platform modernization efforts at scale</li>\n</ul>\n<ul>\n<li>Familiarity with data governance and observability tools such as Atlan, Monte Carlo, Great Expectations, or similar</li>\n</ul>\n<ul>\n<li>Experience building or contributing to a metrics/semantic layer (dbt MetricFlow, Databricks Metric Views, or equivalent)</li>\n</ul>\n<ul>\n<li>Track record of establishing data engineering standards and best practices in a federated analytics organization</li>\n</ul>\n<p>Compensation:</p>\n<p>US Zone 2 $198,900-$269,100 USD</p>\n<p>US Zone 3 $176,800-$239,200 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_53ee0ef3-c62","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Dropbox","sameAs":"https://www.dropbox.com/","logo":"https://logos.yubhub.co/dropbox.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dropbox/jobs/7595183","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$198,900-$269,100 USD","x-skills-required":["SQL","Python","Dimensional data modeling","Schema design","Scalable data architecture","Orchestration tools","dbt"],"x-skills-preferred":["Databricks","Modern lakehouse architectures","Data governance and observability tools","Metrics/semantic layer"],"datePosted":"2026-04-18T15:56:35.190Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - US: Select locations"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, Dimensional data modeling, Schema design, Scalable data architecture, Orchestration tools, dbt, Databricks, Modern lakehouse architectures, Data governance and observability tools, Metrics/semantic layer","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":198900,"maxValue":269100,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_69e8923b-c16"},"title":"Senior Data Scientist","description":"<p>We&#39;re seeking a Senior Data Scientist to join our Research, Analytics &amp; Data Science (RAD) team. Our team uses data and insights to drive evidence-based decision-making, generating actionable insights about our customers, products, and business.</p>\n<p>As a Senior Data Scientist, you&#39;ll partner with product teams to help them identify important questions and answer those questions with data. You&#39;ll work closely with product managers, designers, and engineers to develop key product success metrics, set targets, measure results, and outcomes, and size opportunities.</p>\n<p>You&#39;ll design, build, and update end-to-end data pipelines, working closely with stakeholders to drive the collection of new data and the refinement of existing data sources and tables. You&#39;ll also partner closely with product researchers to build a holistic understanding of our customers, products, and business.</p>\n<p>Increasingly, you&#39;ll use AI-assisted tools to accelerate analysis, coding, and insight generation. You&#39;ll identify opportunities to automate your own workflows and reduce time spent on repetitive tasks. 
You&#39;ll build scalable data products that enable stakeholders to self-serve insights and raise the bar for how AI is used within RAD.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Partnering with product teams to help them identify important questions and answer those questions with data</li>\n<li>Working closely with product managers, designers, and engineers to develop key product success metrics, set targets, measure results, and outcomes, and size opportunities</li>\n<li>Designing, building, and updating end-to-end data pipelines</li>\n<li>Partnering closely with product researchers to build a holistic understanding of our customers, products, and business</li>\n<li>Using AI-assisted tools to accelerate analysis, coding, and insight generation</li>\n<li>Identifying opportunities to automate your own workflows and reduce time spent on repetitive tasks</li>\n<li>Building scalable data products that enable stakeholders to self-serve insights</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>5+ years of experience working with data to solve problems and drive evidence-based decisions</li>\n<li>Strong SQL skills and solid grounding in statistics</li>\n<li>Experience working closely with product teams</li>\n<li>Proven track record of delivering actionable insights that drive measurable impact with minimal supervision</li>\n<li>Strong product intuition, business acumen, and ability to connect analysis to strategy</li>\n<li>Excellent communication skills (technical and non-technical), with a focus on driving decisions and outcomes</li>\n<li>Strong ownership, curiosity, and growth mindset</li>\n<li>Experience with a scientific computing language (e.g., Python)</li>\n</ul>\n<p>Preferred skills include:</p>\n<ul>\n<li>Experience with data modeling and ETL pipelines (esp. dbt)</li>\n<li>Experience building internal tools, data products, or self-serve analytics capabilities</li>\n<li>Experience leveraging AI across the data workflow - from ideation and coding to analysis and communication</li>\n</ul>\n<p>Benefits include:</p>\n<ul>\n<li>Competitive salary and equity in a fast-growing start-up</li>\n<li>Unlimited access to Claude Code and best-in-class AI tools; experimentation &amp; building is encouraged &amp; celebrated</li>\n<li>We serve lunch every weekday, plus a variety of snack foods and a fully stocked kitchen</li>\n<li>Regular compensation reviews - we reward great work</li>\n<li>Peace of mind with life assurance, as well as comprehensive health and dental insurance for you and your dependents</li>\n<li>Open vacation policy and flexible holidays so you can take time off when you need it</li>\n<li>Paid maternity leave, as well as 6 weeks paternity leave for fathers, to let you spend valuable time with your loved ones</li>\n<li>MacBooks are our standard, but we’re happy to get you whatever equipment helps you get your job done</li>\n</ul>\n<p>Experience Level: Senior Employment Type: Full-time Workplace Type: Hybrid Category: Engineering Industry: Technology Salary Range: Competitive salary and equity in a fast-growing start-up Required Skills: SQL, statistics, experience working with product teams, strong product intuition, business acumen, excellent communication skills, strong ownership, curiosity, and growth mindset, experience with a scientific computing language (e.g., Python) Preferred Skills: data modeling and ETL pipelines (esp. 
dbt), building internal tools, data products, or self-serve analytics capabilities, leveraging AI across the data workflow</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_69e8923b-c16","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Intercom","sameAs":"https://www.intercom.com/","logo":"https://logos.yubhub.co/intercom.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/intercom/jobs/7749323","x-work-arrangement":null,"x-experience-level":null,"x-job-type":null,"x-salary-range":null,"x-skills-required":["SQL","statistics","experience working with product teams","strong product intuition","business acumen","excellent communication skills","strong ownership","curiosity","growth mindset","experience with a scientific computing language (e.g., Python)"],"x-skills-preferred":["data modeling and ETL pipelines (esp. dbt)","building internal tools","data products","or self-serve analytics capabilities","leveraging AI across the data workflow"],"datePosted":"2026-04-18T15:56:25.055Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, England"}},"skills":"SQL, statistics, experience working with product teams, strong product intuition, business acumen, excellent communication skills, strong ownership, curiosity, growth mindset, experience with a scientific computing language (e.g., Python), data modeling and ETL pipelines (esp. dbt), building internal tools, data products, or self-serve analytics capabilities, leveraging AI across the data workflow"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_158a429c-4d8"},"title":"Senior Data Scientist - Product Analytics","description":"<p>We are seeking a Senior Data Scientist to join our Research, Analytics &amp; Data Science (RAD) team. The RAD team uses data and insights to drive evidence-based decision-making. We&#39;re a team of data scientists and product researchers who use data to unlock actionable insights about our customers, products, and business.</p>\n<p>As a Senior Data Scientist, you will partner with product teams to help them identify important questions and answer those questions with data. You will work closely with product managers, designers, and engineers to develop key product success metrics, set targets, measure results, and outcomes, and size opportunities.</p>\n<p>Your responsibilities will include designing, building, and updating end-to-end data pipelines, working closely with stakeholders to drive the collection of new data and the refinement of existing data sources and tables. You will also partner closely with product researchers to build a holistic understanding of our customers, products, and business.</p>\n<p>You will influence our product roadmap and product strategy through experimentation, exploratory analysis, and quantitative research. You will build and automate actionable models and dashboards, craft data stories, and share your findings and recommendations across R&amp;D and the broader company.</p>\n<p>You will drive and shape core RAD foundations and help us improve how the RAD org operates.</p>\n<p>We are looking for someone with 5+ years of experience working with data to solve problems and drive evidence-based decisions. You should have excellent SQL skills and experience of applying analytical and statistical approaches to problem-solving. 
You should also have a proven track record of initiating and delivering actionable analysis and insights that drive tangible impact with minimal supervision.</p>\n<p>Excellent communication skills (technical and non-technical) and a focus on driving impact are essential. A strong growth mindset and sense of ownership, innate passion, and curiosity are also required.</p>\n<p>Experience with a scientific computing language (such as R or Python) is necessary. Experience with BI/Visualization tools like Tableau, Superset, and Looker is a bonus. Experience working with product teams and leveraging AI tools to boost efficiency and creativity across the data science workflow is also desirable.</p>\n<p>We offer a competitive salary and equity in a fast-growing start-up. We serve lunch every weekday, plus a variety of snack foods and a fully stocked kitchen. Regular compensation reviews, life assurance, comprehensive health and dental insurance, open vacation policy, flexible holidays, paid maternity leave, and 6 weeks paternity leave are also part of our benefits package.</p>\n<p>Our working policy is hybrid, with employees expected to be in the office at least three days per week. We have a radically open and accepting culture, avoiding divisive subjects to foster a safe and cohesive work environment for everyone.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_158a429c-4d8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Intercom","sameAs":"https://www.intercom.com/","logo":"https://logos.yubhub.co/intercom.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/intercom/jobs/6317929","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Analytical and statistical approaches","Scientific computing language (R or Python)","BI/Visualization tools (Tableau, Superset, Looker)","Product teams experience"],"x-skills-preferred":["AI tools","Data modeling and ETL pipelines","Communication skills (technical and non-technical)"],"datePosted":"2026-04-18T15:56:11.783Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, England"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Analytical and statistical approaches, Scientific computing language (R or Python), BI/Visualization tools (Tableau, Superset, Looker), Product teams experience, AI tools, Data modeling and ETL pipelines, Communication skills (technical and non-technical)"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3ca38cf1-c6d"},"title":"Senior Revenue Operations Analyst","description":"<p>Join Brex, the intelligent finance platform that enables companies to spend smarter and move faster in over 200 markets. As a Senior Revenue Operations Analyst, you will play a critical role in supporting our Post-Sales organization, driving Net Revenue Retention (NRR), expansion, and long-term customer value. 
You will work closely with cross-functional partners to improve visibility and execution across key motions such as renewals, upsell and cross-sell, churn prevention, and account health management.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Create and maintain dashboards that provide Post-Sales leaders and operations partners with clear visibility into business performance</li>\n<li>Proactively identify opportunities to improve reporting and deliver deeper analytical insights on retention, expansion, and account health</li>\n<li>Conduct deep-dive analyses to surface trends, risks, and growth opportunities across the Post-Sales business</li>\n<li>Support the design and implementation of operational improvements in partnership with Revenue Operations team members and cross-functional stakeholders</li>\n<li>Evaluate tool usage and effectiveness to optimize our Post-Sales tech stack</li>\n<li>Build and maintain capacity and funnel models to support planning and scenario analysis</li>\n<li>Support ad hoc data requests, including account deep dives and compensation tracking</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>2+ years of experience in revenue operations, sales operations, business operations, sales finance, or similar; preferably at a fintech or SaaS company</li>\n<li>Strong problem-solving skills and collaborative approach to solutioning</li>\n<li>Excellent written and verbal communication, especially with complex concepts</li>\n<li>Analytical mindset with a high bar for data precision and accuracy</li>\n<li>Self-motivated and able to run with projects autonomously with minimal oversight</li>\n<li>Advanced spreadsheet and modeling skills</li>\n<li>Moderate SQL skills with experience in data visualization tools (Looker or Hex experience a plus)</li>\n<li>High familiarity with CRM data models (Salesforce preferred)</li>\n<li>Experience leveraging AI tools to improve analysis, reporting, or operational efficiency</li>\n<li>Strong attention to detail including in documentation</li>\n</ul>\n<p>Bonus Points:</p>\n<ul>\n<li>Experience with sales compensation design and tracking</li>\n<li>Knowledge of revenue recognition principles and finance systems</li>\n<li>Familiarity with Net Revenue Retention (NRR), churn analysis, and expansion metrics</li>\n<li>Experience working with product or usage data (e.g., product analytics or customer health scoring)</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3ca38cf1-c6d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8485386002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$114,808 to $143,510","x-skills-required":["Revenue operations","Sales operations","Business operations","Sales finance","Data analysis","Data visualization","SQL","CRM data models","AI tools"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:55:25.081Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Seattle, Washington, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"Revenue operations, Sales operations, Business operations, Sales finance, Data analysis, Data visualization, SQL, CRM data models, AI 
tools","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":114808,"maxValue":143510,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_cc9af01d-1b9"},"title":"Partner Business Systems & AI Operations Lead","description":"<p>The Partner Business Systems &amp; AI Operations Lead will own the foundation of the Claude Partner Network, including the Salesforce partner data model, the partner platform stack, and the integrations between them. This role will also define and own the partner data quality standard, administer the partner platform stack, and build and operate the AI automation layer across the partner workflow stack.</p>\n<p>Key responsibilities include owning the Salesforce partner data model end to end, administering the partner platform stack, defining and owning the partner data quality standard, partnering with the Business Process Manager to instrument every partner process, running access and configuration governance for partner systems, and building and operating the AI automation layer.</p>\n<p>The ideal candidate will have five or more years in revenue systems, partner systems, or business systems roles with hands-on Salesforce administration or architecture experience, and will be able to translate a program rule into a schema, a validation rule, and an entitlement flow without a detailed specification.</p>\n<p>Strong candidates may also have Salesforce Administrator or Platform App Builder certification, or experience with Experience Cloud or a PRM such as Impartner or Salesforce PRM, SQL fluency for data quality checks and ad hoc analysis, prior partner program or channel operations experience, and experience standing up a data quality program from the ground up.</p>\n<p>The annual compensation range for this role is $215,000-$300,000 USD.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_cc9af01d-1b9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.com/","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5191437008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$215,000-$300,000 USD","x-skills-required":["Salesforce administration","Salesforce architecture","Data modeling","Data quality","AI automation","Partner systems","Revenue systems","Business systems"],"x-skills-preferred":["Salesforce Administrator certification","Platform App Builder certification","Experience Cloud","PRM","SQL","Partner program operations","Channel operations"],"datePosted":"2026-04-18T15:55:01.134Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Salesforce administration, Salesforce architecture, Data modeling, Data quality, AI automation, Partner systems, Revenue systems, Business systems, Salesforce Administrator certification, Platform App Builder certification, Experience Cloud, PRM, SQL, Partner program operations, Channel 
operations","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":215000,"maxValue":300000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_40db054b-06d"},"title":"Senior Product Manager, Access","description":"<p>We&#39;re looking for a Senior Technical Product Manager to join our Access team within the Acuity Scheduling department. As a Senior Technical Product Manager for Acuity Scheduling, you&#39;ll own the systems that control how customers sign in, manage identity, and pay for the platform.</p>\n<p>This is a hybrid role working 3 days per week from our Aveiro office. You will report to the Group Product Manager on the Acuity Scheduling team.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Product Ownership: Act as the primary product owner for your team, setting the roadmap and priorities based on technical feasibility, user impact, and business goals.</li>\n<li>Technical Strategy &amp; Roadmap: Collaborate with engineers to define a scalable technical strategy that aligns with product goals, focusing on data architecture, systems design, and service-based solutions.</li>\n<li>Cross-functional Collaboration: Partner with engineering, data science, and UX teams to understand requirements, manage trade-offs, and deliver solutions that balance speed and scalability.</li>\n<li>Cross-organization Collaboration: Work directly with Squarespace Identity and Security teams to develop and realise a shared vision of a singular identity and authentication system.</li>\n<li>Stakeholder Communication: Translate technical architecture and system requirements into clear, actionable items for stakeholders across the company, including senior leadership and non-technical teams.</li>\n<li>Quality, Security &amp; Performance Optimisation: Focus on the system&#39;s stability, reliability, and scalability by working closely with the engineering team on continuous improvement, platform security and technical debt management.</li>\n<li>Architecture Oversight: Guide architectural decisions to ensure optimal security, data flow, storage, and access within our product ecosystem. 
Advocate for sustainable choices in a service-oriented approach to component-based architecture.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Experience: 4-6 years in product management or a technical role with product ownership, preferably within a data-driven environment; ideally in an identity-centric role.</li>\n<li>Technical Expertise: Strong background as the technical product lead on teams owning data architecture, systems design, and service-based architecture (e.g., microservices), with the ability to engage deeply in technical discussions and decisions.</li>\n<li>Systems Thinking: Proven experience in end-to-end system thinking and design, including a strong grasp of component-based architectures, data storage options, and integration layers.</li>\n<li>Data Architecture: Hands-on experience with data modeling, database design, and data warehousing principles, including familiarity with large data model improvement initiatives.</li>\n<li>APIs &amp; Integration: Understanding of RESTful API design, OAuth, identity federation, and integration patterns to ensure seamless interoperability between services and systems.</li>\n<li>Analytical Mindset: Proficiency in using data to inform decisions autonomously, including experience with data analysis and product analytics tools.</li>\n<li>Communication Skills: Ability to communicate complex technical concepts to both technical and non-technical audiences, bridging the gap between product vision and technical execution.</li>\n<li>Agile Experience: Familiarity with Agile methodologies, including backlog management, sprint planning, and cross-functional team collaboration.</li>\n<li>Project Management: Familiarity with project management tools like Jira, Asana, or similar.</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Technical Transformation: Experience leading teams and/or organisations from a monolithic architecture to a service-oriented one.</li>\n<li>Identity Experience: High-level understanding of and experience with core user registration flows and OAuth as it pertains to user identity needs.</li>\n<li>Technical Documentation: Experience documenting technical data architecture, service flows, and system dependencies to ensure alignment and knowledge-sharing within the team</li>\n</ul>\n<p><strong>Benefits &amp; Perks</strong></p>\n<ul>\n<li>Health insurance with 100% covered premiums for you, your spouse or partner, and dependent children, including medical, dental, and vision</li>\n<li>Life and Disability Insurance</li>\n<li>Pension benefits with employer match</li>\n<li>Fertility and adoption benefits</li>\n<li>Headspace mindfulness app subscription</li>\n<li>Global Employee Assistance Program</li>\n<li>Statutory paid time off and all statutory leaves, as required</li>\n<li>Meal Allowance and Flex Benefits Account</li>\n<li>Employee donation match to community organisations</li>\n<li>In the easily accessible city centre of Aveiro</li>\n<li>7 Global Employee Resource Groups (ERGs)</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_40db054b-06d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Squarespace","sameAs":"https://www.squarespace.com/about/careers","logo":"https://logos.yubhub.co/squarespace.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/squarespace/jobs/7698954","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data architecture","systems design","service-based architecture","microservices","data modeling","database design","data warehousing","RESTful API design","OAuth","identity federation","integration patterns","data analysis","product analytics tools","Agile methodologies","backlog management","sprint planning","cross-functional team collaboration","project management tools","Jira","Asana"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:54:56.645Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Aveiro"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data architecture, systems design, service-based architecture, microservices, data modeling, database design, data warehousing, RESTful API design, OAuth, identity federation, integration patterns, data analysis, product analytics tools, Agile methodologies, backlog management, sprint planning, cross-functional team collaboration, project management tools, Jira, Asana"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_704c1e4a-ec2"},"title":"Engineering Manager, Guest & Host","description":"<p>We are seeking an experienced Engineering Manager to join our Guest &amp; Host Services team. As an Engineering Manager, you will lead a team of talented software developers to prototype, build and improve hosting services. Your responsibilities will include exploring new product experiences, leading investments into new technical capabilities, collaborating with engineers to plan and sequence the delivery of new features and capabilities, and partnering with peer engineering teams to integrate new features that impact different areas within the product.</p>\n<p>You will also be responsible for recruiting and nurturing exceptional engineering talent, fostering an inclusive team culture and environment that encourages collaboration, technical excellence, and innovation, and making sure the team&#39;s partnerships with key stakeholders in Engineering, Design, and Product are strong.</p>\n<p>To be successful in this role, you will need to have 5+ years of engineering management experience, with 9+ years of relevant software development industry experience in a fast-paced tech environment. 
You will also need to have deep expertise with backend systems and complex data modeling in large-scale consumer applications, excellent communication and presentation skills, and an end-to-end, product-oriented mentality that transcends team boundaries and helps find globally optimal solutions.</p>\n<p>As a member of our team, you will have the opportunity to work on a wide range of projects and contribute to the development of new features and capabilities that will help us achieve our mission of making it easier for people to host on Airbnb.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_704c1e4a-ec2","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Airbnb","sameAs":"https://www.airbnb.com/","logo":"https://logos.yubhub.co/airbnb.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/airbnb/jobs/7532824","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"£120,000-£150,000 GBP","x-skills-required":["backend systems","complex data modeling","large-scale consumer applications","engineering management","software development"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:54:54.506Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, United Kingdom"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"backend systems, complex data modeling, large-scale consumer applications, engineering management, software development","baseSalary":{"@type":"MonetaryAmount","currency":"GBP","value":{"@type":"QuantitativeValue","minValue":120000,"maxValue":150000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3917fb4f-2ab"},"title":"Full Stack Software Engineer","description":"<p>We are looking for a talented full stack software engineer to join our growing team at Anduril Labs in Washington, DC.</p>\n<p>As a full stack software engineer in Anduril Labs, you will help bring innovative, next-generation concepts to life through proof-of-concept development and rapid prototyping using bleeding edge technologies.</p>\n<p>The ideal candidate has exceptional software development and creative problem-solving skills, is a self-starter, and can quickly grasp complex concepts.</p>\n<p>As a full stack software engineer, you possess the skills to architect, develop, and deploy distributed applications and services, including both front-end and back-end components.</p>\n<p>You have experience with agile, end-to-end software development lifecycle and are comfortable developing and deploying code across Windows and Linux-based systems (including standalone bare-metal hardware, virtualized environments, and cloud-hosted platforms).</p>\n<p>Embedded software development experience is a plus.</p>\n<p>You are also proficient in integrating legacy code and systems, leveraging open-source technologies, and developing and utilizing APIs.</p>\n<p>Additionally, you have a solid understanding of AI/ML core concepts (e.g., feature extraction, supervised vs. 
unsupervised learning, regression, classification, clustering, deep learning neural networks, NLP, LLMs, SLMs, model fine-tuning, prompt engineering, RAG) and hands-on experience developing (Gen)AI-enhanced applications or services.</p>\n<p>We also expect candidates to have familiarity with database technologies (e.g., SQL, NoSQL, Graph DB, Vector DB) and experience with data modeling, data wrangling, analytics, and visualization.</p>\n<p>Since Anduril Labs supports all Anduril businesses and product lines, you will have the unique opportunity to work closely with multi-disciplinary engineering and product development teams across the entire company.</p>\n<p>This means you will get to directly contribute to the development of Anduril’s next-generation products and services.</p>\n<p>So if you thrive in a dynamic environment that values creative problem-solving, love writing code, excel as both an individual contributor and team player, are eager to learn, and bring a can-do attitude, this role is for you.</p>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Lead the development of prototypes to demonstrate advanced concepts in areas like autonomous and multi-agent systems, GenAI, advanced data analytics, quantum computing/sensing/networking/comms/machine learning, modeling, simulation, optimization, visualization, next-gen human-machine interfaces, heterogenous computing, and cybersecurity.</li>\n</ul>\n<ul>\n<li>Own the entire Software Development Lifecycle from inception through development, testing, deployment, and documentation for Anduril Labs-developed software prototypes.</li>\n</ul>\n<ul>\n<li>Interface and collaborate with other Anduril and customer engineering teams, and strategic partners.</li>\n</ul>\n<ul>\n<li>Support Anduril- and customer-funded R&amp;D efforts.</li>\n</ul>\n<ul>\n<li>Participate in field experiments and technology demonstrations.</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>3+ years of programming with Python, C++, Java, Rust, Go, or JavaScript/TypeScript.</li>\n</ul>\n<ul>\n<li>Proven software architecture and design skills.</li>\n</ul>\n<ul>\n<li>Ability to quickly understand and navigate complex systems and established codebases.</li>\n</ul>\n<ul>\n<li>AI/ML development using commercial and open-source AI frameworks, models, and tools (e.g., Jupyter Notebook, PyTorch, TensorFlow, Scikit-learn, OpenAI, Claude, Gemini, Llama, LangChain, YOLO, AWS Sagemaker, Bedrock, Azure AI, RAG).</li>\n</ul>\n<ul>\n<li>Web app development (e.g., React, Angular, or Vue).</li>\n</ul>\n<ul>\n<li>Cloud development (e.g., AWS, Azure, or GCP).</li>\n</ul>\n<ul>\n<li>Data modeling and wrangling.</li>\n</ul>\n<ul>\n<li>Networking basics (e.g., DNS, TCP/IP vs. 
UDP, socket communications, LDAP, Active Directory).</li>\n</ul>\n<ul>\n<li>Database technologies (e.g., SQL, NoSQL, Graph DB, Vector DB).</li>\n</ul>\n<ul>\n<li>API development and integration (e.g., REST, GraphQL).</li>\n</ul>\n<ul>\n<li>Containerization technologies (e.g., Docker, Kubernetes).</li>\n</ul>\n<ul>\n<li>Software development on Linux and Windows.</li>\n</ul>\n<ul>\n<li>Demonstrable hands-on experience using GenAI tools (e.g., OpenAI Codex, Claude Code, Gemini Code Assist, GitHub Copilot, Amazon CodeWhisperer, or similar) for software development, code generation, debugging, and algorithmic exploration.</li>\n</ul>\n<ul>\n<li>Experience with Git version control, build tools, and CI/CD pipelines.</li>\n</ul>\n<ul>\n<li>Demonstrated understanding and application of software testing principles and practices, including unit testing, integration testing, and end-to-end testing.</li>\n</ul>\n<ul>\n<li>Strong problem-solving skills, meticulous attention to detail, and the ability to work effectively in a collaborative team environment.</li>\n</ul>\n<ul>\n<li>Excellent communication and interpersonal skills, with the ability to effectively articulate complex technical concepts to diverse audiences.</li>\n</ul>\n<ul>\n<li>Eligible to obtain and maintain an active U.S. Top Secret SCI security clearance.</li>\n</ul>\n<p><strong>Preferred Qualifications:</strong></p>\n<ul>\n<li>BS in Computer Science, Engineering, or similar field.</li>\n</ul>\n<ul>\n<li>Distributed applications development (e.g., client/server, microservices, multi-agent solutions).</li>\n</ul>\n<ul>\n<li>High performance computing (HPC) and big data technologies (e.g., Apache Spark, Hadoop).</li>\n</ul>\n<ul>\n<li>Mobile app development (e.g., iOS or Android).</li>\n</ul>\n<ul>\n<li>Embedded software development experience.</li>\n</ul>\n<ul>\n<li>Willingness to travel up to approximately 10% US</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3917fb4f-2ab","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5089044007","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$132,000-$198,000 USD","x-skills-required":["Python","C++","Java","Rust","Go","JavaScript/TypeScript","Software Architecture","AI/ML","Web App Development","Cloud Development","Data Modeling","Networking","Database Technologies","API Development","Containerization","Git Version Control","Build Tools","CI/CD Pipelines","Unit Testing","Integration Testing","End-to-End Testing"],"x-skills-preferred":["Distributed Applications Development","High Performance Computing","Big Data Technologies","Mobile App Development","Embedded Software Development"],"datePosted":"2026-04-18T15:54:45.879Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, District of Columbia, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, C++, Java, Rust, Go, JavaScript/TypeScript, Software Architecture, AI/ML, Web App Development, Cloud Development, Data Modeling, Networking, Database Technologies, API Development, Containerization, Git Version Control, Build Tools, CI/CD Pipelines, Unit Testing, 
Integration Testing, End-to-End Testing, Distributed Applications Development, High Performance Computing, Big Data Technologies, Mobile App Development, Embedded Software Development","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":132000,"maxValue":198000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3a17bc01-d7d"},"title":"Staff Software Engineer","description":"<p>DBT Labs is seeking a Staff Software Engineer to join our Engineering team. As a seasoned engineer, you will architect and build the durable memory substrate that powers agentic analytics workflows. This platform stores not just metadata, but meaning: decisions, intent, rationale, and history , and makes it safely accessible to humans, agents, and applications.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Prototyping apt technical solutions and finding best fits for the context engine.</li>\n<li>Architecting and building the core Context Platform.</li>\n<li>Designing schemas and primitives for Decision Memory and enterprise context.</li>\n<li>Owning context storage systems (graph, vector, event/time-based).</li>\n<li>Building read/write/query APIs used by agents, products, and external apps.</li>\n<li>Designing permission-aware, auditable context access.</li>\n</ul>\n<p>You will be working closely with agentic systems engineers and product leadership to ensure the context engine is interoperable, portable, and zero-lock-in by design.</p>\n<p>In this role, you will own:</p>\n<ul>\n<li>Context schemas and schema evolution strategies.</li>\n<li>Storage and data modeling choices.</li>\n<li>Platform APIs and interfaces.</li>\n<li>Security, identity propagation, and audit foundations.</li>\n<li>Long-term scalability and correctness of context data.</li>\n</ul>\n<p>You will not own:</p>\n<ul>\n<li>Agent behavior or orchestration logic.</li>\n<li>Business rules or governance policy decisions.</li>\n<li>Product UI or workflow automation.</li>\n</ul>\n<p>The ideal candidate will have significant experience building distributed systems, data platforms, or infrastructure, and will be comfortable operating in ambiguous, greenfield problem spaces. 
They will also have deep expertise in data modeling and schema design, experience designing shared platforms used by many teams, and strong instincts around APIs, contracts, and backward compatibility.</p>\n<p>Nice to have experience with knowledge graphs, metadata systems, or search/retrieval systems, experience building systems with governance, auditability, or compliance requirements, and familiarity with dbt or modern analytics stacks or developer tooling.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3a17bc01-d7d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4661362005","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Distributed systems","Data platforms","Infrastructure","Data modeling","Schema design","APIs","Contracts","Backward compatibility","Knowledge graphs","Metadata systems","Search/retrieval systems"],"x-skills-preferred":["dbt","Modern analytics stacks","Developer tooling"],"datePosted":"2026-04-18T15:54:01.444Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"India - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Distributed systems, Data platforms, Infrastructure, Data modeling, Schema design, APIs, Contracts, Backward compatibility, Knowledge graphs, Metadata systems, Search/retrieval systems, dbt, Modern analytics stacks, Developer tooling"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_584a7343-f48"},"title":"Senior Revenue Strategy & Operations Manager","description":"<p>About Mixpanel</p>\n<p>Mixpanel is a digital analytics platform that helps companies understand user behavior and track company success metrics.</p>\n<p>The Revenue Strategy &amp; Operations team at Mixpanel partners with Regional Business Leaders &amp; Global Leaders to help set and execute global and regional revenue strategies.</p>\n<p>About the Role</p>\n<p>As Senior Revenue Strategy &amp; Operations Manager, you&#39;ll support on defining the strategy behind customer retention and account growth. 
You&#39;ll partner closely with CS and Sales leadership to improve Gross and Net Revenue Retention, enable scalable expansion motions, and proactively identify areas to enhance the customer experience and lifetime value.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Drive strategic initiatives that shape how we engage, retain, and grow our customer base</li>\n</ul>\n<ul>\n<li>Conduct advanced market, customer, and product analyses to uncover whitespace opportunities, inform strategic bets, and guide resource allocation</li>\n</ul>\n<ul>\n<li>Own and evolve the reporting framework for sales and post-sales KPIs, translating insights into recommendations that shape our post-sales strategy</li>\n</ul>\n<ul>\n<li>Act as a strategic thought partner to GTM leadership, bringing analytical rigor and business acumen to inform key decisions and go-to-market priorities</li>\n</ul>\n<ul>\n<li>Provide high-impact operational support to the Sales organization, proactively identifying bottlenecks and implementing scalable solutions to improve performance</li>\n</ul>\n<ul>\n<li>Run and evolve key operating cadences, including business performance deep dives, and executive-level business reviews across post-sales teams</li>\n</ul>\n<ul>\n<li>Collaborate cross-functionally with Product, Sales, Finance, and Marketing to ensure alignment and shared ownership of customer outcomes throughout the full lifecycle</li>\n</ul>\n<p>We&#39;re Looking for Someone Who Has</p>\n<ul>\n<li>4+ years of experience at a top-tier Management or Strategy Consulting firm</li>\n</ul>\n<ul>\n<li>2+ years of operating experience in Corporate Strategy, Business Operations, or Sales Strategy in a high-growth, fast-paced environment at a B2B SaaS organization</li>\n</ul>\n<ul>\n<li>Experience defining and operationalizing GTM Strategy and proven experience driving impact on retention and expansion metrics (GRR, NRR, etc.)</li>\n</ul>\n<ul>\n<li>Strong project management experience with demonstrated ability to effectively manage time, prioritize tasks, and work within deadlines</li>\n</ul>\n<ul>\n<li>Track record of collaborating with cross-functional partners to execute projects</li>\n</ul>\n<ul>\n<li>Highly motivated, innate intellectual curiosity, and a strong desire to drive impact</li>\n</ul>\n<ul>\n<li>Excellent problem-solving skills, and the ability to thrive in a fast-paced, dynamic environment</li>\n</ul>\n<ul>\n<li>Outstanding written and oral communication skills</li>\n</ul>\n<p>Bonus Points For</p>\n<ul>\n<li>Working knowledge of SFDC including Reporting and Record Management from lead creation through opportunity closure</li>\n</ul>\n<ul>\n<li>Proficiency in modeling and analyzing complex and large data sets</li>\n</ul>\n<ul>\n<li>Demonstrable passion for the data analytics industry</li>\n</ul>\n<p>Compensation</p>\n<p>The amount listed below is the total target cash compensation (TTCC) and includes base compensation and variable compensation in the form of either a company bonus or commissions. Variable compensation type is determined by your role and level. In addition to the cash compensation provided, this position is also eligible for equity consideration and other benefits including medical, vision, and dental insurance coverage.</p>\n<p>Our salary ranges are determined by role and level and are benchmarked to the SF Bay Area Technology data cut released by Radford, a global compensation database. The range displayed represents the minimum and maximum TTCC for new hire salaries for the position across all of our US locations. 
To stay on top of market conditions, we refresh our salary ranges twice a year so these ranges may change in the future. Within the range, individual pay is determined by experience, job-related skills, qualifications, and other factors. If you have questions about the specific range, your recruiter can share this information.</p>\n<p>Mixpanel Compensation Range $189,000-$231,000 USD</p>\n<p>Benefits and Perks</p>\n<ul>\n<li>Comprehensive Medical, Vision, and Dental Care</li>\n</ul>\n<ul>\n<li>Mental Wellness Benefit</li>\n</ul>\n<ul>\n<li>Generous Vacation Policy &amp; Additional Company Holidays</li>\n</ul>\n<ul>\n<li>Enhanced Parental Leave</li>\n</ul>\n<ul>\n<li>Volunteer Time Off</li>\n</ul>\n<ul>\n<li>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</li>\n</ul>\n<p>Culture Values</p>\n<ul>\n<li>Make Bold Bets: We choose courageous action over comfortable progress.</li>\n</ul>\n<ul>\n<li>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience, and collective wisdom to drive powerful outcomes.</li>\n</ul>\n<ul>\n<li>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</li>\n</ul>\n<ul>\n<li>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</li>\n</ul>\n<ul>\n<li>Champion the Customer: We seek to deeply understand our customers&#39; needs, ensuring their success is our north star.</li>\n</ul>\n<ul>\n<li>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</li>\n</ul>\n<p>Why choose Mixpanel?</p>\n<p>We&#39;re a leader in analytics with over 9,000 customers and $277M raised from prominent investors like Andreessen-Horowitz, Sequoia, YC, and, most recently, Bain Capital. Mixpanel&#39;s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics. Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service. Choosing to work at Mixpanel means you&#39;ll be helping the world&#39;s most innovative companies learn from their data so they can make better decisions.</p>\n<p>Mixpanel is an equal opportunity employer supporting workforce diversity. At Mixpanel, we are focused on things that really matter: our people, our customers, our partners, out of a recognition that those relationships are the most valuable assets we have. We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply. We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, veteran status, or disability status. Pursuant to the San Francisco Fair Chance Ordinance or other similar laws that may be applicable, we will consider for employment qualified applicants with arrest and conviction records. 
We&#39;ve immersed ourselves in our Culture and Values as our guiding principles for the impact we want to have and the future we are building.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_584a7343-f48","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mixpanel","sameAs":"https://mixpanel.com","logo":"https://logos.yubhub.co/mixpanel.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mixpanel/jobs/7008408","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$189,000-$231,000 USD","x-skills-required":["digital analytics","data analysis","project management","strategic planning","business operations","sales strategy","customer retention","account growth","Gross and Net Revenue Retention","expansion metrics","SFDC","Reporting and Record Management"],"x-skills-preferred":["data modeling","data visualization","data science","machine learning","cloud computing","cybersecurity"],"datePosted":"2026-04-18T15:53:55.303Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, US (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Operations","industry":"Technology","skills":"digital analytics, data analysis, project management, strategic planning, business operations, sales strategy, customer retention, account growth, Gross and Net Revenue Retention, expansion metrics, SFDC, Reporting and Record Management, data modeling, data visualization, data science, machine learning, cloud computing, cybersecurity","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":189000,"maxValue":231000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_165c3a6f-f1e"},"title":"Data Engineer, Analytics","description":"<p>We are looking for an experienced Data Engineer, Analytics to join our data team. As a Data Engineer, Analytics, you will be responsible for owning the transformation and semantic layer that turns data into clean, tested, well-documented tables and dashboards that data scientists, product managers, and business stakeholders can trust and self-serve from.</p>\n<p>You will define and operationalize the metrics that inform how we identify opportunities, measure success, and make decisions. This includes designing, building, and maintaining curated analytical datasets and data models that serve as the canonical sources for metrics, dashboards, and analyses.</p>\n<p>You will partner closely with data science, product managers, and engineering teams to translate business questions into well-modeled, performant, and discoverable data assets. 
You will execute metric workflows, from metric definition and logging schema design to data modeling and visualization, with guidance from your manager and senior team members.</p>\n<p>You will also build and maintain executive-level dashboards and self-serve reporting tools that enable business stakeholders to answer their own questions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, build, and maintain curated analytical datasets and data models that serve as the canonical sources for metrics, dashboards, and analyses.</li>\n</ul>\n<ul>\n<li>Partner closely with data science, product managers, and engineering teams to translate business questions into well-modeled, performant, and discoverable data assets.</li>\n</ul>\n<ul>\n<li>Execute metric workflows, from metric definition and logging schema design to data modeling and visualization, with guidance from your manager and senior team members.</li>\n</ul>\n<ul>\n<li>Build and maintain executive-level dashboards and self-serve reporting tools that enable business stakeholders to answer their own questions.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>2+ years of experience in analytics or data engineering with a strong focus on building curated, consumer-facing datasets.</li>\n</ul>\n<ul>\n<li>2+ years of experience in designing, developing, and maintaining robust data models from structured and unstructured sources to power a variety of use cases, including experimentation.</li>\n</ul>\n<ul>\n<li>2+ years of experience writing accurate and effective SQL.</li>\n</ul>\n<ul>\n<li>Fluency in Python or another programming language.</li>\n</ul>\n<ul>\n<li>Experience building and owning executive-level dashboards and reports using BI tools (e.g., Looker, Tableau, or similar).</li>\n</ul>\n<ul>\n<li>Strong business acumen: you will partner with data scientists and product managers to translate ambiguous questions into concrete metric definitions and data models.</li>\n</ul>\n<ul>\n<li>Excellent communication: comfortable being the connective tissue between technical and business teams.</li>\n</ul>\n<p>Bonus Points:</p>\n<ul>\n<li>Passion for Discord or online communities.</li>\n</ul>\n<ul>\n<li>Experience building or contributing to a semantic layer or metrics store.</li>\n</ul>\n<ul>\n<li>Experience with modern analytics and data engineering tools (dbt, BigQuery, etc.).</li>\n</ul>\n<ul>\n<li>Experience implementing and monitoring audits for data quality with massive data sets (e.g. 
billions of rows).</li>\n</ul>\n<ul>\n<li>Experience working on SEO, GEO, or other top of funnel growth focused features.</li>\n</ul>\n<ul>\n<li>Experience collaborating with compliance, legal, or litigation cross-functional teams.</li>\n</ul>\n<p>The US base salary range for this full-time position is $160,000 to $180,000 + equity + benefits.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_165c3a6f-f1e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Discord","sameAs":"https://discord.com/","logo":"https://logos.yubhub.co/discord.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/discord/jobs/8371252002","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$160,000 to $180,000 + equity + benefits","x-skills-required":["data engineering","analytics","SQL","Python","BI tools","data modeling","metric definition","data visualization"],"x-skills-preferred":["semantic layer","metrics store","modern analytics tools","data quality audits","SEO","GEO","compliance","legal","litigation"],"datePosted":"2026-04-18T15:53:32.713Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco Bay Area"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, analytics, SQL, Python, BI tools, data modeling, metric definition, data visualization, semantic layer, metrics store, modern analytics tools, data quality audits, SEO, GEO, compliance, legal, litigation","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":160000,"maxValue":180000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_58a44dab-91a"},"title":"Partner Solutions Architect - Japan","description":"<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across Japan. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>You will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud.</p>\n<p>Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing. This is not a purely reactive enablement role. The Partner SA is expected to help shape and execute repeatable partner plays that create revenue.</p>\n<p>That includes enabling partner sellers and architects, supporting account mapping and seller-to-seller engagement, helping define joint value propositions, supporting partner-led pipeline generation, and influencing product and field strategy based on what is learned in-market.</p>\n<p>Internal operating docs show this motion consistently includes enablement sessions, QBR sponsorships, account planning, workshops, field events, and targeted campaigns designed to produce sourced and influenced pipeline.</p>\n<p>You&#39;ll be part of a team helping dbt scale its ecosystem through better partner capability, tighter field alignment, and more repeatable pipeline generation. 
The role is especially important as dbt continues investing in structured partner motions and deeper engagement with major cloud and data platform partners.</p>\n<p>What you&#39;ll do:</p>\n<ul>\n<li>Partner closely with Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n</ul>\n<ul>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n</ul>\n<ul>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n</ul>\n<ul>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n</ul>\n<ul>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n</ul>\n<ul>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>\n</ul>\n<ul>\n<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n</ul>\n<ul>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n</ul>\n<ul>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n</ul>\n<ul>\n<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n</ul>\n<ul>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>\n</ul>\n<ul>\n<li>Travel approximately 30-40% to support partner planning, enablement, executive meetings, and field events across Japan</li>\n</ul>\n<p>This scope reflects how the Partner SA team is already operating: enabling partner field teams, building account-level alignment, supporting QBRs and regional events, and translating those activities into sourced and engaged pipeline.</p>\n<p>What you&#39;ll need:</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>\n</ul>\n<ul>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n</ul>\n<ul>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n</ul>\n<ul>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n</ul>\n<ul>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n</ul>\n<ul>\n<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n</ul>\n<ul>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n</ul>\n<ul>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or 
partner-influenced sales cycles</li>\n</ul>\n<ul>\n<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>\n</ul>\n<ul>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out:</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n</ul>\n<ul>\n<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n</ul>\n<ul>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n</ul>\n<ul>\n<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>\n</ul>\n<ul>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n</ul>\n<ul>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n</ul>\n<ul>\n<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>\n</ul>\n<ul>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>What to expect in the interview process (all video interviews unless accommodations are needed):</p>\n<ul>\n<li>Interview with Talent Acquisition Partner</li>\n</ul>\n<ul>\n<li>Interview with Hiring Manager</li>\n</ul>\n<ul>\n<li>Team Interviews</li>\n</ul>\n<ul>\n<li>Demo Round</li>\n</ul>\n<p>#LI-LA1</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_58a44dab-91a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673657005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data platforms","Snowflake","Databricks","Google Cloud","partner engineering","customer-facing technical role"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:53:29.744Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Japan - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner engineering, customer-facing technical role, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_be766cd7-8e2"},"title":"Staff Software Engineer, Backend (Iasi)","description":"<p>We are excited to expand our operations to Romania and build a tech hub in the region. 
As a Staff full-stack engineer, with a backend focus, you will be at the forefront of shaping the future of customer engagement! You&#39;ll be instrumental in delivering timely, actionable insights that drive business growth from day one.</p>\n<p>We&#39;re building a state-of-the-art Customer Data Platform, visualizing relevant insights for businesses post-onboarding and guiding customer engagement across all touch-points. Be part of the team that&#39;s redefining the way businesses connect with their customers!</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, implement, and maintain backend services and APIs to support applications.</li>\n<li>Build and optimize data storage solutions using Postgres, ClickHouse and Elasticsearch to ensure high performance and scalability.</li>\n<li>Collaborate with cross-functional teams, including frontend engineers, data scientists, and machine learning engineers, to deliver end-to-end solutions.</li>\n<li>Monitor and troubleshoot performance issues in distributed systems and databases.</li>\n<li>Write clean, maintainable, and efficient code following best practices for backend development.</li>\n<li>Participate in code reviews, testing, and continuous integration efforts.</li>\n<li>Ensure security, scalability, and reliability of backend services.</li>\n<li>Analyze and improve system architecture, focusing on performance bottlenecks, scaling, and security.</li>\n</ul>\n<p>Qualifications We Value:</p>\n<ul>\n<li>Proven experience as a Backend Engineer with a focus on database design and system architecture.</li>\n<li>Strong expertise in ClickHouse or similar columnar databases for managing large-scale, real-time analytical queries.</li>\n<li>Hands-on experience with Elasticsearch for indexing and searching large datasets.</li>\n<li>Proficient in backend programming languages such as Python, Go.</li>\n<li>Experience with RESTful API design and development.</li>\n<li>Solid understanding of distributed systems, microservices architecture, and cloud infrastructure.</li>\n<li>Experience with performance tuning, data modeling, and query optimization.</li>\n<li>Strong problem-solving skills and attention to detail.</li>\n<li>Excellent communication and teamwork abilities.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_be766cd7-8e2","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Cresta","sameAs":"https://www.cresta.ai/","logo":"https://logos.yubhub.co/cresta.ai.png"},"x-apply-url":"https://job-boards.greenhouse.io/cresta/jobs/5030292008","x-work-arrangement":"hybrid","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Backend Engineer","Database design","System architecture","ClickHouse","Elasticsearch","Python","Go","RESTful API design","Distributed systems","Microservices architecture","Cloud infrastructure","Performance tuning","Data modeling","Query optimization"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:52:36.898Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Iasi, Romania (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Backend Engineer, Database design, System architecture, ClickHouse, Elasticsearch, Python, Go, RESTful API design, Distributed systems, Microservices architecture, Cloud infrastructure, Performance tuning, Data modeling, Query 
optimization"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_47807ca3-e36"},"title":"Strategic AI/BI Account Executive","description":"<p>We are seeking a Strategic AI/BI Account Executive to help enterprise customers transform how business users interact with data. This high-impact role sits within the AI Go-To-Market team and partners closely with Enterprise Account Executives to drive adoption of Databricks AI/BI and Genie in APJ.</p>\n<p>You will help organisations move beyond static dashboards to governed, conversational, AI-powered analytics at the centre of the convergence of business intelligence, data platforms, and generative AI. Enterprise analytics is rapidly evolving from dashboards and static reporting to conversational, AI-driven decision platforms. Databricks AI/BI and Genie empower business users to securely interact with governed data using natural language, transforming the data platform into a true decision platform.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Partner with Enterprise AEs to identify, qualify, and close AI/BI opportunities</li>\n<li>Engage C-level, analytics, and line-of-business leaders to modernise analytics strategies</li>\n<li>Displace or expand legacy BI platforms with AI-powered, governed analytics solutions</li>\n<li>Lead conversations around semantic governance, self-service analytics, and natural language data access</li>\n<li>Drive proof-of-value engagements and scale enterprise-wide adoption</li>\n<li>Align AI/BI initiatives to measurable business outcomes (productivity, speed to insight, revenue impact)</li>\n<li>Enable field teams and serve as a subject matter expert on modern analytics architectures</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>Enterprise sales experience in BI, analytics, data platforms, or AI/ML</li>\n<li>Strong understanding of modern analytics architectures and data governance</li>\n<li>Ability to sell to both technical and business stakeholders</li>\n<li>Executive presence and experience navigating complex buying cycles</li>\n<li>Passion for AI and the impact of GenAI on enterprise analytics</li>\n<li>Experience operating in a specialist or overlay sales model</li>\n<li>Ability to translate technical capabilities into clear business value</li>\n<li>7+ years of Enterprise Sales experience, exceeding quotas in larger accounts</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot</li>\n<li>Familiarity with semantic layers, metrics stores, or governed data models</li>\n<li>Understanding of lakehouse architectures and cloud data platforms</li>\n<li>Exposure to GenAI, natural language interfaces, or conversational applications</li>\n<li>Consulting or solution design experience in customer-facing roles</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_47807ca3-e36","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8441884002","x-work-arrangement":"onsite","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Enterprise sales experience in BI, analytics, data platforms, or 
AI/ML","Strong understanding of modern analytics architectures and data governance","Ability to sell to both technical and business stakeholders","Executive presence and experience navigating complex buying cycles","Passion for AI and the impact of GenAI on enterprise analytics"],"x-skills-preferred":["Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot","Familiarity with semantic layers, metrics stores, or governed data models","Understanding of lakehouse architectures and cloud data platforms","Exposure to GenAI, natural language interfaces, or conversational applications","Consulting or solution design experience in customer-facing roles"],"datePosted":"2026-04-18T15:52:23.856Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Singapore"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Enterprise sales experience in BI, analytics, data platforms, or AI/ML, Strong understanding of modern analytics architectures and data governance, Ability to sell to both technical and business stakeholders, Executive presence and experience navigating complex buying cycles, Passion for AI and the impact of GenAI on enterprise analytics, Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot, Familiarity with semantic layers, metrics stores, or governed data models, Understanding of lakehouse architectures and cloud data platforms, Exposure to GenAI, natural language interfaces, or conversational applications, Consulting or solution design experience in customer-facing roles"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a738803a-64f"},"title":"Head of Enterprise Marketing Strategy & Analytics","description":"<p><strong>About the Role</strong></p>\n<p>This foundational leadership role will build and lead the Enterprise Marketing Strategy &amp; Analytics function, serving as the operating system for a rapidly scaling marketing organisation. 
The primary mandate is to define and measure success across all marketing programmes, from demand generation (field events, ABM, EBCs, partner co-marketing) to pipeline contribution, creating a clear line from investment to pipeline to revenue.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Define and own the Enterprise Marketing measurement framework, targets, and reporting, covering the full funnel from top-of-funnel demand through pipeline influence and closed-won attribution.</li>\n<li>Build and maintain core analytics infrastructure (data models, attribution logic, dashboards) in partnership with Revenue Operations and Data Science, ensuring marketing and sales alignment on key metrics.</li>\n<li>Serve as one of the primary operating partners to Finance, HR, and Recruiting, leading budget tracking, headcount planning, and vendor management.</li>\n<li>Partner with marketing leadership and the central Marketing Ops &amp; Strategy team on annual and quarterly planning, resource allocation, and performance reviews.</li>\n<li>Establish the operating cadence for Enterprise Marketing (QBRs, pipeline reviews, program retros), coordinating with the central Marketing Ops &amp; Strategy team on organisation-wide rhythms, and drive the preparation needed to make these forums decision-useful.</li>\n<li>Lead the identification of high-leverage workflows to automate, partnering with the central GTM AI team on implementation and measuring productivity gains.</li>\n<li>Build and manage the Marketing Operations, Demand Analytics, and MarTech team, setting a high bar for analytical rigor and business partnership.</li>\n<li>Drive cross-functional alignment on shared definitions, tooling, and a single source of truth for marketing performance across the broader Marketing organisation and with Revenue Operations.</li>\n<li>Conduct strategic analyses to inform key organisational decisions, such as resource deployment, coverage ratios, and campaign capacity planning.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>10+ years in marketing operations, analytics, revenue operations, or strategy roles, including at least 3 years leading a team.</li>\n<li>Experience building or significantly scaling a marketing ops/analytics function at a high-growth B2B technology company undergoing significant organisational expansion.</li>\n<li>Deep fluency in the enterprise demand funnel, including lead scoring, MQL/SQL definitions, pipeline attribution, and campaign influence models.</li>\n<li>Hands-on expertise with the modern GTM data stack (CRM, Marketing Automation, BI tools).</li>\n<li>Proven track record of strategic partnership with Finance and Revenue Operations, including experience building budget models and sitting in planning cycles.</li>\n<li>Expertise in running the core operational rhythm of a marketing organisation: QBRs, headcount tracking, budget pacing, and vendor renewals.</li>\n<li>Strong written and verbal communication, capable of translating complex datasets into clear business narratives.</li>\n<li>Genuine curiosity about AI and a willingness to be an early, hands-on adopter of automation tools in your team’s workflows.</li>\n</ul>\n<p><strong>Logistics</strong></p>\n<p>Minimum education: Bachelor’s degree or an equivalent combination of education, training, and/or experience. Required field of study: A field relevant to the role as demonstrated through coursework, training, or professional experience. Minimum years of experience: Years of experience required will correlate with the 
internal job level requirements for the position Location-based hybrid policy: Currently, we expect all staff to be in one of our offices at least 25% of the time. Visa sponsorship: We do sponsor visas!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_a738803a-64f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.com/","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5169101008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$400,000-$400,000 USD","x-skills-required":["marketing operations","analytics","revenue operations","strategy","demand generation","field events","ABM","EBCs","partner co-marketing","pipeline contribution","marketing programmes","investment","pipeline","revenue","measurement framework","targets","reporting","funnel","top-of-funnel demand","pipeline influence","closed-won attribution","core analytics infrastructure","data models","attribution logic","dashboards","Data Science","marketing","sales","alignment","key metrics","budget tracking","headcount planning","vendor management","marketing leadership","central Marketing Ops & Strategy team","annual planning","quarterly planning","resource allocation","performance reviews","operating cadence","QBRs","pipeline reviews","program retros","organisation-wide rhythms","decision-useful","high-leverage workflows","automation","GTM AI team","implementation","productivity gains","Demand Analytics","MarTech team","analytical rigor","business partnership","cross-functional alignment","shared definitions","tooling","single source of truth","marketing performance","strategic analyses","resource deployment","coverage ratios","campaign capacity planning","lead scoring","MQL/SQL definitions","pipeline attribution","campaign influence models","modern GTM data stack","CRM","Marketing Automation","BI tools","strategic partnership","Finance","budget models","planning cycles","core operational rhythm","headcount tracking","budget pacing","vendor renewals","written communication","verbal communication","complex datasets","business narratives","AI","automation tools"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:52:21.649Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA | Seattle, WA"}},"employmentType":"FULL_TIME","occupationalCategory":"Marketing","industry":"Technology","skills":"marketing operations, analytics, revenue operations, strategy, demand generation, field events, ABM, EBCs, partner co-marketing, pipeline contribution, marketing programmes, investment, pipeline, revenue, measurement framework, targets, reporting, funnel, top-of-funnel demand, pipeline influence, closed-won attribution, core analytics infrastructure, data models, attribution logic, dashboards, Data Science, marketing, sales, alignment, key metrics, budget tracking, headcount planning, vendor management, marketing leadership, central Marketing Ops & Strategy team, annual planning, quarterly planning, resource allocation, performance reviews, operating cadence, QBRs, pipeline reviews, program retros, organisation-wide rhythms, decision-useful, high-leverage workflows, automation, GTM AI team, implementation, productivity gains, Demand Analytics, MarTech team, analytical rigor, business partnership, 
cross-functional alignment, shared definitions, tooling, single source of truth, marketing performance, strategic analyses, resource deployment, coverage ratios, campaign capacity planning, lead scoring, MQL/SQL definitions, pipeline attribution, campaign influence models, modern GTM data stack, CRM, Marketing Automation, BI tools, strategic partnership, Finance, budget models, planning cycles, core operational rhythm, headcount tracking, budget pacing, vendor renewals, written communication, verbal communication, complex datasets, business narratives, AI, automation tools","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":400000,"maxValue":400000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e1c6866e-f9e"},"title":"Staff Software Engineer, Backend (Cluj)","description":"<p>We are excited to expand our operations to Romania and build a tech hub in the region. As a Staff full-stack engineer, with a backend focus, you will be at the forefront of shaping the future of customer engagement! You&#39;ll be instrumental in delivering timely, actionable insights that drive business growth from day one. We&#39;re building a state-of-the-art Customer Data Platform, visualizing relevant insights for businesses post-onboarding and guiding customer engagement across all touch-points.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, implement, and maintain backend services and APIs to support applications.</li>\n<li>Build and optimize data storage solutions using Postgres, ClickHouse and Elasticsearch to ensure high performance and scalability.</li>\n<li>Collaborate with cross-functional teams, including frontend engineers, data scientists, and machine learning engineers, to deliver end-to-end solutions.</li>\n<li>Monitor and troubleshoot performance issues in distributed systems and databases.</li>\n<li>Write clean, maintainable, and efficient code following best practices for backend development.</li>\n<li>Participate in code reviews, testing, and continuous integration efforts.</li>\n<li>Ensure security, scalability, and reliability of backend services.</li>\n<li>Analyze and improve system architecture, focusing on performance bottlenecks, scaling, and security.</li>\n</ul>\n<p>Qualifications We Value:</p>\n<ul>\n<li>Proven experience as a Backend Engineer with a focus on database design and system architecture.</li>\n<li>Strong expertise in ClickHouse or similar columnar databases for managing large-scale, real-time analytical queries.</li>\n<li>Hands-on experience with Elasticsearch for indexing and searching large datasets.</li>\n<li>Proficient in backend programming languages such as Python, Go.</li>\n<li>Experience with RESTful API design and development.</li>\n<li>Solid understanding of distributed systems, microservices architecture, and cloud infrastructure.</li>\n<li>Experience with performance tuning, data modeling, and query optimization.</li>\n<li>Strong problem-solving skills and attention to detail.</li>\n<li>Excellent communication and teamwork abilities.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e1c6866e-f9e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Cresta","sameAs":"https://www.cresta.ai/","logo":"https://logos.yubhub.co/cresta.ai.png"},"x-apply-url":"https://job-boards.greenhouse.io/cresta/jobs/5102480008","x-work-arrangement":"hybrid","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Postgres","ClickHouse","Elasticsearch","Python","Go","RESTful API design and development","Distributed systems","Microservices architecture","Cloud infrastructure","Performance tuning","Data modeling","Query optimization"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:52:06.437Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Cluj, Romania (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Postgres, ClickHouse, Elasticsearch, Python, Go, RESTful API design and development, Distributed systems, Microservices architecture, Cloud infrastructure, Performance tuning, Data modeling, Query optimization"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_98161ddd-28c"},"title":"Data Analyst III","description":"<p>Why join us</p>\n<p>Brex is a finance platform that enables companies to spend smarter and move faster in over 200 markets. It combines global corporate cards and banking with intuitive spend management, bill pay, and travel software.</p>\n<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>\n<p>We&#39;re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>\n<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Data at Brex</p>\n<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>\n<p>Our Data Scientists, Analysts, and Engineers work together to make data,and insights derived from data,a core asset across the company.</p>\n<p>What you&#39;ll do</p>\n<p>As a senior Data Analyst (DA III), you will own the end-to-end analytics lifecycle for one or more business areas at Brex.</p>\n<p>You&#39;ll go beyond building dashboards,you&#39;ll frame the right questions, design rigorous analyses, apply statistical methods, and translate your findings into clear recommendations for leadership.</p>\n<p>You will also serve as a technical leader on the Data Analytics team, mentoring more junior analysts and helping define the standards and best practices that elevate the team&#39;s work.</p>\n<p>This role sits at the intersection of analytics, analytics engineering, and business strategy.</p>\n<p>You&#39;ll work in a modern data stack environment and partner closely with Data Scientists, Data Engineers, and senior leaders across the organization.</p>\n<p>Where you&#39;ll work</p>\n<p>This role will be based in our San Francisco office.</p>\n<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from 
home.</p>\n<p>We currently require a minimum of three coordinated days in the office per week, Monday, Wednesday and Thursday.</p>\n<p>As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities</p>\n<ul>\n<li>Own the analytics lifecycle for assigned business areas: from problem framing and data sourcing through analysis, insight generation, and stakeholder presentation.</li>\n</ul>\n<ul>\n<li>Build and maintain dashboards and self-service reporting tools that enable business teams to independently track performance, identify risks, and make data-driven decisions.</li>\n</ul>\n<ul>\n<li>Write production-quality SQL and Python code to extract, transform, and analyze data at scale.</li>\n</ul>\n<ul>\n<li>Collaborate with Data Engineers and Data Scientists to develop and maintain analytical data models, improve data pipelines, and ensure data quality across the organization.</li>\n</ul>\n<ul>\n<li>Partner with leadership across Sales, Operations, Product, Finance, and other departments to identify high-impact analytical opportunities and deliver actionable recommendations.</li>\n</ul>\n<ul>\n<li>Mentor other data analysts and contribute to the development of team standards, documentation, code review practices, and analytical frameworks.</li>\n</ul>\n<ul>\n<li>Proactively identify gaps in data infrastructure, propose improvements, and contribute to the evolution of the team’s tooling and processes.</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>5+ years of experience in data analytics, business intelligence, or a related quantitative role.</li>\n</ul>\n<ul>\n<li>3+ years of experience partnering directly with Sales, Operations, Product, or equivalent business teams as an embedded analytics partner.</li>\n</ul>\n<ul>\n<li>Advanced SQL proficiency, including CTEs, window functions, performance optimization, and working across complex data models.</li>\n</ul>\n<ul>\n<li>Proficiency in Python for data analysis, automation, and modeling (Pandas, NumPy, scikit-learn, or similar).</li>\n</ul>\n<ul>\n<li>Experience with cloud data warehouses, particularly Snowflake (BigQuery and Databricks also valued).</li>\n</ul>\n<ul>\n<li>Hands-on experience with BI and data visualization tools (Looker, Tableau, Hex, or similar).</li>\n</ul>\n<ul>\n<li>Strong stakeholder management skills,proven ability to present complex technical findings to non-technical audiences.</li>\n</ul>\n<ul>\n<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automated reporting, and build self-service data tools.</li>\n</ul>\n<p>Bonus points</p>\n<ul>\n<li>Demonstrated experience applying statistical methods to business problems (e.g., regression, classification, A/B testing).</li>\n</ul>\n<ul>\n<li>Experience with dbt for data modeling and transformation.</li>\n</ul>\n<ul>\n<li>Experience building and maintaining data pipelines using orchestration tools such as Airflow.</li>\n</ul>\n<ul>\n<li>Experience working with APIs for data ingestion and integration.</li>\n</ul>\n<ul>\n<li>Familiarity with version control systems (Git).</li>\n</ul>\n<ul>\n<li>Experience in fintech, financial services, or payments.</li>\n</ul>\n<ul>\n<li>Track record of leading cross-functional analytics projects from scoping through delivery.</li>\n</ul>\n<p>Compensation</p>\n<p>The expected salary range for this role is $114,192 - $142,740.</p>\n<p>However, the starting base pay will depend on a number of factors including the candidate’s location, 
skills, experience, market demands, and internal pay parity.</p>\n<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_98161ddd-28c","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8463699002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$114,192 - $142,740","x-skills-required":["Advanced SQL","Python","Cloud data warehouses","BI and data visualization tools","Stakeholder management","Generative AI and LLM-based tools"],"x-skills-preferred":["Statistical methods","dbt for data modeling and transformation","Orchestration tools","APIs for data ingestion and integration","Version control systems","Fintech, financial services, or payments"],"datePosted":"2026-04-18T15:51:38.936Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"Advanced SQL, Python, Cloud data warehouses, BI and data visualization tools, Stakeholder management, Generative AI and LLM-based tools, Statistical methods, dbt for data modeling and transformation, Orchestration tools, APIs for data ingestion and integration, Version control systems, Fintech, financial services, or payments","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":114192,"maxValue":142740,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9537437b-e23"},"title":"Staff Backend Engineer, Knowledge Graph (Rust)","description":"<p>As a Staff Backend Engineer on the GitLab Knowledge Graph team, you&#39;ll help design, scale, and operate a high-impact graph data service that underpins agents, analytics, and architecture-level features across GitLab.com, Dedicated, and Self-Managed deployments.</p>\n<p>You&#39;ll partner with a small, senior Rust-first team to ship reliable graph capabilities and make them easy for other teams and agents to use. The Knowledge Graph service is a distributed SDLC indexing system. It builds a property graph from GitLab SDLC (software development lifecycle) and code data using ClickHouse, NATS JetStream, and the Data Insights Platform. It also exposes secure graph queries and MCP tools for AI agents and product features.</p>\n<p>In this role, you&#39;ll own core parts of the system end to end: shaping the architecture, hardening multi-tenant behavior and performance, and making it straightforward for other teams and agents to consume graph capabilities. 
In your first year, you&#39;ll take clear ownership of major areas of the service (for example, the graph query engine, SDLC indexing, or multi-tenant authorization), reduce single points of failure through better runbooks and shared context, and raise the bar on how we design, build, and operate analytical services across the stack.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Leading the design and evolution of core Knowledge Graph services in a production Rust codebase, including the graph query engine, SDLC and code indexing pipelines, and API/MCP surfaces that other GitLab teams and AI agents rely on.</li>\n</ul>\n<ul>\n<li>Owning complex, cross-cutting initiatives that span GitLab Rails, the Data Insights Platform (Siphon, NATS, ClickHouse), and GitLab Duo Agent Platform, from technical direction and design docs through implementation, rollout, and iteration.</li>\n</ul>\n<ul>\n<li>Driving system design decisions that improve reliability, scalability, and maintainability for analytical (OLAP-style) graph workloads. This includes multi-hop traversals, aggregations, and multi-tenant isolation. Document trade-offs so the broader team can move quickly and stay aligned.</li>\n</ul>\n<ul>\n<li>Defining and improving operational maturity for the service, including service level objectives (SLOs), observability, runbooks, incident response, capacity planning, and production readiness (PREP) for GitLab.com, Dedicated, and Self-Managed deployments.</li>\n</ul>\n<ul>\n<li>Collaborating asynchronously with product, data, infrastructure, security, and AI teams to sequence work, unblock platform-level dependencies, and land features in a way that is safe for customers and sustainable for the team.</li>\n</ul>\n<ul>\n<li>Applying AI-assisted development workflows responsibly (for example, using MCP-aware tools, Knowledge Graph-backed agents, and internal Duo tooling) and help establish practical norms for how the team uses AI while maintaining strong engineering judgment.</li>\n</ul>\n<ul>\n<li>Mentoring and supporting other engineers through pairing, technical design reviews, and knowledge-sharing, reinforcing shared ownership of the system and its operational sustainability.</li>\n</ul>\n<ul>\n<li>Contributing across the stack when needed, including occasional Ruby (Rails integration and authorization paths) or frontend work (for example, the Software Architecture Map UI) to close gaps and keep delivery moving.</li>\n</ul>\n<p>This role requires significant experience building and operating production backend systems, with a track record of owning reliability, maintainability, and on-call readiness for services that support other product teams or platforms. Strong engineering skills in Rust or clear evidence you can ramp quickly and deliver in a Rust-first, performance-sensitive backend codebase are essential. 
Additionally, strong system design skills, including making and explaining clear architectural decisions, documenting constraints, and aligning trade-offs with product and platform needs, are necessary.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9537437b-e23","directApply":true,"hiringOrganization":{"@type":"Organization","name":"GitLab","sameAs":"https://about.gitlab.com/","logo":"https://logos.yubhub.co/about.gitlab.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/gitlab/jobs/8481945002","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Rust","ClickHouse","NATS JetStream","Data Insights Platform","graph data modeling","query patterns","property graphs","Cypher/GQL","n-hop traversals","aggregations","multi-tenant isolation","service level objectives","observability","runbooks","incident response","capacity planning","production readiness","AI-assisted development workflows","MCP-aware tools","Knowledge Graph-backed agents","internal Duo tooling"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:51:38.397Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, India"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Rust, ClickHouse, NATS JetStream, Data Insights Platform, graph data modeling, query patterns, property graphs, Cypher/GQL, n-hop traversals, aggregations, multi-tenant isolation, service level objectives, observability, runbooks, incident response, capacity planning, production readiness, AI-assisted development workflows, MCP-aware tools, Knowledge Graph-backed agents, internal Duo tooling"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_09a4d1ce-cde"},"title":"Data Engineer","description":"<p>We are looking for an experienced Data Engineer to partner with our Data Science and Data Infrastructure teams to own and scale our data pipelines. You&#39;ll also work closely with stakeholders across business teams including sales, marketing, and finance to ensure that the data they need arrives promptly and reliably.</p>\n<p>As a Data Engineer at Figma, you will be responsible for building and maintaining scalable data pipelines that connect various cloud data sources. You will develop a deep understanding of Figma&#39;s core data models and optimize data pipelines for scale. You will partner with the Data Science and Data Infrastructure teams to build new foundational data sets that are trusted, well understood, and enable self-service.</p>\n<p>You will work with a wide range of cross-functional stakeholders to derive requirements and architect shared datasets; ability to document, simplify and explain complex problems to different types of audiences. 
You will establish best practices for the development of specialized data sets for analytics and modeling.</p>\n<p>We&#39;d love to hear from you if you have:</p>\n<ul>\n<li>4+ years in a relevant field.</li>\n<li>Fluency with both SQL and Python.</li>\n<li>Familiarity with Snowflake, dbt, Dagster, and ETL/reverse ETL tools.</li>\n<li>Excellent judgment and creative problem-solving skills.</li>\n<li>A self-starting mindset along with strong communication and collaboration skills.</li>\n</ul>\n<p>While not required, it&#39;s an added plus if you also have:</p>\n<ul>\n<li>Knowledge in data modeling methodologies to design and build robust data architectures for insightful analytics.</li>\n<li>Experience with business systems such as Salesforce, Customer IO, Stripe, NetSuite is a big plus.</li>\n</ul>\n<p>At Figma, one of our values is Grow as you go. We believe in hiring smart, curious people who are excited to learn and develop their skills. If you&#39;re excited about this role but your past experience doesn&#39;t align perfectly with the points outlined in the job description, we encourage you to apply anyways. You may be just the right candidate for this or other roles.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_09a4d1ce-cde","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Figma","sameAs":"https://www.figma.com/","logo":"https://logos.yubhub.co/figma.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/figma/jobs/5220003004","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$140,000-$348,000 USD","x-skills-required":["SQL","Python","Snowflake","dbt","Dagster","ETL/reverse ETL tools"],"x-skills-preferred":["data modeling methodologies","business systems such as Salesforce, Customer IO, Stripe, NetSuite"],"datePosted":"2026-04-18T15:51:04.727Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA • New York, NY • United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, Snowflake, dbt, Dagster, ETL/reverse ETL tools, data modeling methodologies, business systems such as Salesforce, Customer IO, Stripe, NetSuite","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":140000,"maxValue":348000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_dbd787ee-1b3"},"title":"Senior Software Engineer - Security Platform Team","description":"<p>As a Senior Software Engineer on the Security Platform Delivery and Success team, you will help design and build workflows that make it easier for customers to understand how AI is helping them save time and money. 
You&#39;ll collaborate closely with other security engineering teams, as well as with product and design, to deliver resilient, high-scale features used by security practitioners around the world.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Designing and implementing features that power Value Reports, AI-powered Inboxes, and Guided onboarding for all of the Security products.</li>\n<li>Building and evolving APIs that correlate entities, findings, signals, and configuration data into coherent security stories.</li>\n<li>Developing scalable, high-performance systems within the Elastic ecosystem and cloud-native environments.</li>\n<li>Owning the reliability, observability, and operational health, from design to production.</li>\n<li>Collaborating with product, design, and other engineering teams to refine requirements and deliver impactful, user-centered solutions.</li>\n</ul>\n<p>To succeed in this role, you will need:</p>\n<ul>\n<li>Solid programming experience in JavaScript/TypeScript, with a particular emphasis on React.js and Node.js frameworks.</li>\n<li>Experience designing and implementing APIs, data models, and services that support rich application workflows.</li>\n<li>Ability to take ownership of problems end-to-end: from clarifying requirements and proposing designs to delivering, monitoring, and iterating in production.</li>\n<li>Comfort working in a distributed, async-first environment and collaborating with colleagues across time zones.</li>\n<li>A proactive mindset: you ask the right questions, challenge assumptions, and look for ways to improve both product and engineering processes.</li>\n</ul>\n<p>Bonus points for familiarity with Kibana or Elasticsearch, and contributions to open source.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_dbd787ee-1b3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Elastic, the Search AI Company","sameAs":"https://www.elastic.co/","logo":"https://logos.yubhub.co/elastic.co.png"},"x-apply-url":"https://job-boards.greenhouse.io/elastic/jobs/7310546","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$128,300-$203,000 CAD","x-skills-required":["JavaScript","TypeScript","React.js","Node.js","API design","data modeling","service development"],"x-skills-preferred":["Kibana","Elasticsearch","open source contributions"],"datePosted":"2026-04-18T15:51:00.409Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Canada"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"JavaScript, TypeScript, React.js, Node.js, API design, data modeling, service development, Kibana, Elasticsearch, open source contributions","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":128300,"maxValue":203000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_86696218-8f0"},"title":"Staff Backend Engineer (Ruby on Rails/AI), Verify","description":"<p>As a Staff Backend Engineer (AI) in the Verify stage at GitLab, you&#39;ll help shape and scale the core infrastructure behind GitLab CI. You&#39;ll play a central role in how we integrate AI into CI/CD workflows. 
Your work will impact performance, reliability, and usability for people running millions of CI jobs, from small teams to the largest enterprises.</p>\n<p>In this role, you&#39;ll go beyond using AI tools and help define how we design, build, and iterate on AI-assisted and agentic CI experiences. You&#39;ll set standards for what good looks like across our AI agent portfolio, including how we measure success, how we instrument behavior in production, and how we account for large language model limitations. You&#39;ll also help responsibly integrate GitLab&#39;s Duo Agent Platform into CI workflows at scale, on a foundation that&#39;s fast, reliable, secure, and observable.</p>\n<p>We have ambitious goals for Agentic CI in FY27. As a Staff Engineer, you will:</p>\n<ul>\n<li>Partner with Engineering, Product, and UX leadership to pressure-test our priorities: where we can move faster, where we&#39;re missing data, and where there&#39;s whitespace to innovate. Part of this includes learning and growing with the Engineering team you will collaborate closely with.</li>\n</ul>\n<ul>\n<li>Define what success looks like across our agent portfolio and make sure we&#39;re tracking against it: not just shipping, but learning.</li>\n</ul>\n<ul>\n<li>Bring a sharp eye to the competitive landscape, helping us understand what it takes to keep GitLab CI best-in-class in an increasingly agentic world.</li>\n</ul>\n<p>Examples of Agentic CI work we have planned for the upcoming year:</p>\n<ul>\n<li>AI Pipeline Builder, the foundational CI agent that auto-creates pipelines for new projects and serves as the launchpad for onboarding new CI users.</li>\n</ul>\n<ul>\n<li>Automate the Fix a Failing Pipeline flow at scale – from dogfooding on internal GitLab projects through to safe, controlled rollout for customers, solving real infrastructure and scalability challenges.</li>\n</ul>\n<ul>\n<li>Build the instrumentation and observability layer that makes agentic CI trustworthy (trigger volume dashboards, retry rates, cost safeguards) so we can measure what&#39;s working, catch what isn&#39;t, and iterate with confidence.</li>\n</ul>\n<ul>\n<li>Harden the CI pipeline execution infrastructure that these agents depend on: database access patterns, background processing, and job orchestration built to handle the additional load that AI-driven automation introduces at enterprise scale.</li>\n</ul>\n<p>You&#39;ll shape and scale GitLab CI backend infrastructure to improve performance, reliability, and usability for users running jobs at high volume. You&#39;ll design and implement AI-powered features for Agentic CI, including agents, agentic flows, and LLM-backed tooling that integrates with GitLab&#39;s Duo Agent Platform. You&#39;ll define what success looks like for AI in CI before you build, including baselines, measurable outcomes, and clear signals that help the team learn and iterate. You&#39;ll build the instrumentation and observability needed to make AI-assisted CI trustworthy in production, including feature behavior metrics, dashboards, and safeguards. You&#39;ll own and drive measurable performance improvements across CI systems (for example, database access patterns, background processing, and job orchestration) by forming hypotheses, running experiments, and validating results with data. You&#39;ll write secure, well-tested, maintainable Ruby on Rails code in a large monolith, improving existing features while reducing technical debt and operational risk. 
You&#39;ll lead cross-functional technical work with Product, UX, and Infrastructure, influencing architecture and execution across the Verify stage. You&#39;ll share standards, patterns, and learnings with other engineers, raising the bar for responsible AI integration and evidence-driven engineering across CI.</p>\n<p>This role requires advanced proficiency with Ruby and Ruby on Rails, with experience building and maintaining reliable backend services in a large codebase. You should have strong PostgreSQL skills, including data modeling, query tuning, and scaling large tables through proactive performance investigation and remediation. You should have hands-on experience building, running, and debugging high-traffic production systems, ideally in CI, workflow orchestration, or adjacent infrastructure-heavy domains. You should have practical experience designing and shipping AI-powered backend features and integrations, including sound judgment about large language model limitations and responsible use in production. You should have a data-driven approach to engineering: defining hypotheses, establishing baseline metrics, instrumenting changes, and measuring outcomes against clear success criteria. You should have familiarity with observability patterns and tools (metrics, logging, tracing) to diagnose issues, improve reliability, and guide iteration. You should have strong backend architecture and delivery practices, including secure design, well-tested code, and strategies for safe rollouts and zero-downtime changes. You should have clear written and verbal communication skills, including writing technical proposals and documentation, and collaborating effectively in a remote, asynchronous, cross-functional environment.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_86696218-8f0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"GitLab","sameAs":"https://about.gitlab.com/","logo":"https://logos.yubhub.co/about.gitlab.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/gitlab/jobs/8448283002","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Ruby","Ruby on Rails","PostgreSQL","Data modeling","Query tuning","Scaling large tables","High-traffic production systems","CI","Workflow orchestration","Infrastructure-heavy domains","AI-powered backend features","Large language model limitations","Responsible use in production","Data-driven approach to engineering","Observability patterns","Metrics","Logging","Tracing","Backend architecture","Delivery practices","Secure design","Well-tested code","Safe rollouts","Zero-downtime changes"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:50:58.310Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, APAC; Remote, Canada; Remote, Ireland; Remote, Netherlands; Remote, United Kingdom; Remote, US; Remote, US-Southeast"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Ruby, Ruby on Rails, PostgreSQL, Data modeling, Query tuning, Scaling large tables, High-traffic production systems, CI, Workflow orchestration, Infrastructure-heavy domains, AI-powered backend features, Large language model limitations, Responsible use in production, Data-driven approach to engineering, Observability patterns, Metrics, 
Logging, Tracing, Backend architecture, Delivery practices, Secure design, Well-tested code, Safe rollouts, Zero-downtime changes"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1be89b3c-bc1"},"title":"Staff Analytics Engineer","description":"<p>We are currently hiring for multiple teams:</p>\n<p>Foundational Data team: Our mission in the Foundational Data team is to build and maintain high-quality datasets frequently used across all of Airbnb. We set company-wide standards that decide how locations are grouped into regions, visitors are measured based upon site traffic, bot traffic is separated from organic traffic, and cloud costs are attributed to Airbnb services. This data is used to build public financial reports, drive strategic marketing decisions, and manage operational costs.</p>\n<p>AirCover Data Foundation: The AirCover Data Foundation team is responsible for providing trustworthy, consistent data and metrics to facilitate business insights, informed decision-making, and seamless operations across Airbnb&#39;s AirCover programs, such as Guest Travel Insurance, AirCover for Hosts, and AirCover for Guests.</p>\n<p>As a Staff Analytics Engineer, you will bring a unique lens to our data strategy and provide in-depth technical mentorship and leadership to the team. We are looking for someone with expertise in data modeling, metric development, and large-scale distributed data processing frameworks like Presto or Spark.</p>\n<p>Leveraging our internal, top-tier data tooling alongside other resources, you will empower both technical and non-technical teams across Airbnb to utilize our data for making decisions grounded in evidence. Staff-level engineers are expected to do this with a minimal amount of supervision. 
We value innovative thinkers who consistently seek smarter and more efficient solutions while managing daily operations, deadlines, and collaborating with team members.</p>\n<p>A Typical Day:</p>\n<ul>\n<li>Develop high-quality data assets to satisfy a wide range of use-cases</li>\n<li>Develop frameworks and tools to scale insight generation to meet critical business and infrastructure requirements</li>\n<li>Collaborate and build strong partnerships with other data practitioners throughout Airbnb</li>\n<li>Influence the trajectory of data in decision making</li>\n<li>Improve trust in our data by championing for data quality across the stack</li>\n</ul>\n<p>Your Expertise:</p>\n<ul>\n<li>9+ years of experience with a BS/Masters or 6+ years with a PhD</li>\n<li>Fluent in SQL and proficient in at least one data engineering language, such as Python or Scala</li>\n<li>Expertise using business intelligence and reporting tools like Superset and Tableau</li>\n<li>Expertise in large-scale distributed data processing frameworks like Presto or Spark</li>\n<li>Expertise in data modeling for data warehouses and/or metrics repositories</li>\n<li>Experience with an ETL framework like Airflow</li>\n<li>Clear and mature communication skills: ability to distill complex ideas for technical and non-technical stakeholders</li>\n<li>Ability to provide technical leadership and mentorship, guiding teams on best practices and contributing to the development of analytic engineering strategies</li>\n<li>Experience exploring and leveraging LLM AI’s in everyday tasks (coding, documentation, etc…)</li>\n<li>Strong capability to forge trusted partnerships across working teams</li>\n</ul>\n<p>Nice to have:</p>\n<ul>\n<li>Scaling data tasks via automation</li>\n<li>Previous experience in large-scale cloud-based software engineering or system architecture</li>\n<li>Experience with AB experimentation</li>\n<li>Familiarity with AI/ML algorithms, including their dependencies on data, as well as their respective strengths and limitations</li>\n<li>Designing and/or leveraging high-quality data visualization tools</li>\n</ul>\n<p>Your Location: This position is US - Remote Eligible. The role may include occasional work at an Airbnb office or attendance at offsites, as agreed to with your manager. While the position is Remote Eligible, you must live in a state where Airbnb, Inc. has a registered entity. Click here for the up-to-date list of excluded states.</p>\n<p>Our Commitment To Inclusion &amp; Belonging: Airbnb is committed to working with the broadest talent pool possible. We believe diverse ideas foster innovation and engagement, and allow us to attract creatively-led people, and to develop the best products, services and solutions. All qualified individuals are encouraged to apply. We strive to also provide a disability inclusive application and interview process. If you are a candidate with a disability and require reasonable accommodation in order to submit an application, please contact us at: reasonableaccommodations@airbnb.com.</p>\n<p>How We&#39;ll Take Care of You: Our job titles may span more than one career level. The actual base pay is dependent upon many factors, such as: training, transferable skills, work experience, business needs and market demands. The base pay range is subject to change and may be modified in the future. This role may also be eligible for bonus, equity, benefits, and Employee Travel Credits. 
Pay Range $194,000-$240,000 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_1be89b3c-bc1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Airbnb","sameAs":"https://www.airbnb.com/","logo":"https://logos.yubhub.co/airbnb.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/airbnb/jobs/7733495","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$194,000-$240,000 USD","x-skills-required":["SQL","Python","Scala","Presto","Spark","Superset","Tableau","ETL","Airflow","Data Modeling","Data Warehousing","Metrics Repositories"],"x-skills-preferred":["LLM AI","AI/ML Algorithms","Data Visualization","Cloud-Based Software Engineering","System Architecture","AB Experimentation"],"datePosted":"2026-04-18T15:50:40.547Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, Scala, Presto, Spark, Superset, Tableau, ETL, Airflow, Data Modeling, Data Warehousing, Metrics Repositories, LLM AI, AI/ML Algorithms, Data Visualization, Cloud-Based Software Engineering, System Architecture, AB Experimentation","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":194000,"maxValue":240000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ae05b1e7-61b"},"title":"Salesforce Administrator II","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>As a Salesforce Administrator II at Brex, you will work as a core member of the Salesforce Engineering team. You will help configure, maintain, and scale the Salesforce ecosystem that powers our Sales, Support, and Operations teams. We’re looking for someone eager to build scalable solutions, automate complex processes, and maintain a high bar for data quality and system health.</p>\n<p>In this role, you will move beyond simple ticket resolution to own features end-to-end. You will collaborate directly with engineers and business stakeholders to translate requirements into technical reality. You’ll be encouraged to think like an engineer, focusing on reliability, scalability, and clean configuration, to help Brex prepare for the new markets we are about to enter.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our São Paulo office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require 3 days per week in the office - Monday, Wednesday, and Thursday. 
As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities</p>\n<ul>\n<li>Collaborate with cross-functional teams (Sales, Operations, Finance) to configure and implement new Salesforce features and enhancements.</li>\n<li>Manage day-to-day administration, including user management, roles, profiles, sharing rules, and security controls.</li>\n<li>Own configuration tasks end-to-end, from understanding the user need to testing and deployment.</li>\n<li>Build and maintain complex Flows and automation logic, ensuring they are performant and scalable.</li>\n<li>Participate in the software development lifecycle (SDLC) by managing deployments, validating metadata changes, and maintaining documentation.</li>\n<li>Tune and polish existing configurations to specific engineering standards and assist in technical debt reduction.</li>\n<li>Triaging and resolving bugs and support requests to maintain high system reliability.</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>2+ years of professional experience as a Salesforce Administrator.</li>\n<li>Experience with Salesforce Sales Cloud and Service Cloud.</li>\n<li>Strong proficiency with Salesforce declarative development (Flows, Object Manager, Validation Rules) and security models.</li>\n<li>Experience deploying changes through Salesforce DevOps pipelines.</li>\n<li>A high bar for configuration standards, best practices, naming conventions, and documentation.</li>\n<li>Ability to communicate technical concepts clearly to non-technical stakeholders.</li>\n<li>Strong understanding of SOQL and Salesforce&#39;s data model.</li>\n<li>Solid grasp of Salesforce&#39;s Governor Limits.</li>\n</ul>\n<p>Bonus points</p>\n<ul>\n<li>Experience with declarative integrations between Salesforce and third-party applications.</li>\n<li>Familiarity with Agile methodologies.</li>\n<li>Experience with DevOps tools (e.g., Gearset, Copado) or version control (Git).</li>\n<li>Basic understanding of Apex (ability to read/debug code) and programmatic solutions available.</li>\n<li>Experience working within a high-growth technology or financial services company.</li>\n<li>Salesforce certifications: Administrator, Platform App Builder, Sales Cloud Consultant, Service Cloud Consultant, etc.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ae05b1e7-61b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8398611002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Salesforce","Sales Cloud","Service Cloud","Flows","Object Manager","Validation Rules","Security Models","DevOps Pipelines","SOQL","Salesforce Data Model","Governor Limits"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:50:33.799Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"São Paulo, São Paulo, Brazil"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Salesforce, Sales Cloud, Service Cloud, Flows, Object Manager, Validation Rules, Security Models, DevOps Pipelines, SOQL, Salesforce Data Model, Governor 
Limits"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ae53b8d4-8fd"},"title":"Sr. AI Engineer, Application Engineering","description":"<p>We&#39;re hiring a Sr. AI Engineer to join our IT department to manage AI agentic deployments and deliver real impact for our customers. As a Sr. AI Engineer, you will design and develop tailored solutions using the Elastic Agent Builder platform and related technologies, guide technical engagements, and support the growth of junior engineers.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Technical Delivery and Implementation: Own the end-to-end technical delivery of AI solutions, including writing code, configuring systems, and resolving issues, while reviewing the work of junior team members to ensure quality deployment and measurable business impact.</li>\n</ul>\n<ul>\n<li>AI Solution Development: Take ownership of designing and implementing scalable production systems, including AI and Large Language Model (LLM) based intelligent agents and automated workflows built on the Salesforce platform.</li>\n</ul>\n<ul>\n<li>Custom Agentic AI Engineering: Work directly with stakeholders to design and build custom intelligent agents using the Elastic Agent Builder platform, ensuring solutions meet unique business requirements and integrate smoothly with existing tool ecosystems.</li>\n</ul>\n<ul>\n<li>Data Configuration and Integration: Own the full data lifecycle, from data model design to building efficient processing pipelines and establishing integration strategies. Ensure data is optimized and secure for AI applications, including in complex enterprise environments.</li>\n</ul>\n<ul>\n<li>Technical Problem Solving: Identify, analyze, and resolve technical challenges across all phases of solution delivery, from data integration to model deployment and agent orchestration. Serve as a reliable resource for unblocking progress.</li>\n</ul>\n<ul>\n<li>Agentic Innovation: Develop expertise in the Elastic platform, pushing its capabilities forward. Lead the development of custom intelligent agents, automate business processes, and shape user experiences. Insights from the field will directly influence product enhancements and platform direction.</li>\n</ul>\n<ul>\n<li>Client Partnership: Embed with client teams to understand their operational challenges and goals. 
Translate requirements into clear technical designs, build strong relationships, and serve as a trusted technical advisor.</li>\n</ul>\n<ul>\n<li>Debugging and Root Cause Analysis: Perform thorough analysis, debugging, and root cause identification for complex system interactions, data flows, and AI model behaviors to optimize performance and prevent recurring issues.</li>\n</ul>\n<ul>\n<li>Prototyping and Iteration: Rapidly develop proofs-of-concept and minimum viable products, often coding alongside client teams to demonstrate capabilities and gather feedback for iterative refinement.</li>\n</ul>\n<ul>\n<li>Engineering Best Practices: Apply and promote standards for code quality, scalability, security, and maintainability across all deployed solutions.</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>At least 5 years&#39; experience in a hands-on, end-to-end delivery role for scalable production solutions in a professional environment</li>\n</ul>\n<ul>\n<li>Expert-level proficiency in one or more programming languages (e.g., JavaScript, Java, Python)</li>\n</ul>\n<ul>\n<li>Extensive experience building and deploying solutions with AI/LLM technologies, including integrating LLMs, applying AI orchestration frameworks (e.g., LangChain, LlamaIndex), prompt engineering techniques, and agentic frameworks</li>\n</ul>\n<ul>\n<li>Deep expertise in data modeling, processing, integration, and analytics, with proficiency in enterprise data platforms (e.g., Salesforce Data Cloud, Snowflake, Databricks, BigQuery)</li>\n</ul>\n<ul>\n<li>Strong collaboration, communication, and presentation skills, both written and verbal, with the ability to explain complex technical concepts to technical and non-technical partners</li>\n</ul>\n<ul>\n<li>Track record of leading technical engagements, mentoring junior team members, and taking responsibility for technical aspects of projects</li>\n</ul>\n<p>This role is eligible to participate in Elastic&#39;s stock program and has a competitive salary range of $94,300-$149,200 USD, with an alternate range of $113,300-$179,200 USD in select locations.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ae53b8d4-8fd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Elastic","sameAs":"https://www.elastic.co/","logo":"https://logos.yubhub.co/elastic.co.png"},"x-apply-url":"https://job-boards.greenhouse.io/elastic/jobs/7722032","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$94,300-$149,200 USD","x-skills-required":["JavaScript","Java","Python","AI/LLM technologies","LangChain","LlamaIndex","prompt engineering techniques","agentic frameworks","data modeling","processing","integration","analytics","Salesforce Data Cloud","Snowflake","Databricks","BigQuery"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:49:59.873Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"JavaScript, Java, Python, AI/LLM technologies, LangChain, LlamaIndex, prompt engineering techniques, agentic frameworks, data modeling, processing, integration, analytics, Salesforce Data Cloud, Snowflake, Databricks, 
BigQuery","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":94300,"maxValue":149200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_760c3e88-e35"},"title":"Senior Product Manager, Data","description":"<p>Job Title: Senior Product Manager, Data</p>\n<p>We are seeking a Senior Product Manager to support the development of CoreWeave&#39;s Enterprise Data Platform within the CIO organization. This role will contribute to building a scalable, high-performance data lake and data architecture, integrating data from key sources across Operations, Engineering, Sales, Finance, and other IT partners.</p>\n<p>As a Senior Product Manager for Data Infrastructure and Analytics, you will help drive data ingestion, transformation, governance, and analytics enablement. You will collaborate with engineering, analytics, finance, and business teams to help deliver data lake and pipeline orchestration solutions, ensuring accessible data for business insights.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Own and evangelize Data Platform and Business Analytics roadmap and strategy across CoreWeave</li>\n<li>Assist with the execution of CoreWeave&#39;s enterprise data architecture, helping enable the data lake and domain-driven data layer</li>\n<li>Support the development and enhancement of data ingestion, transformation, and orchestration pipelines for scalability, efficiency, and reliability</li>\n<li>Work with the Engineering and Data teams to maintain and enhance data pipelines for both structured and unstructured data, enabling efficient data movement across the organization</li>\n<li>Collaborate with Finance, GTM, Infrastructure, Data Center, and Supply Chain teams to help unify and model data from core systems (ERP, CRM, Asset Mgmt, Supply Chain systems, etc.)</li>\n<li>Contribute to data governance and quality initiatives, focusing on data consistency, lineage tracking, and compliance with security standards</li>\n<li>Support the BI and analytics layer by partnering with stakeholders to enable data products, dashboards, and reporting capabilities</li>\n<li>Help prioritize data-driven initiatives, ensuring alignment with business goals and operational needs in coordination with leadership</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5+ years of experience in data product management, data architecture, or enterprise data engineering roles</li>\n<li>Familiarity with data lakes, data warehouses, ETL/ELT and streaming pipelines, and data governance frameworks</li>\n<li>Hands-on experience with modern data stack technologies (such as Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka)</li>\n<li>Understanding of data modeling, domain-driven design, and creating scalable data platforms</li>\n<li>Experience supporting the end-to-end data product lifecycle, including requirements gathering and implementation</li>\n<li>Strong collaboration skills with engineering, analytics, and business teams to help deliver data initiatives</li>\n<li>Awareness of data security, compliance, and governance best practices</li>\n<li>Understanding of BI and analytics platforms (such as Tableau, Looker, Power BI) and supporting self-service analytics</li>\n</ul>\n<p>Why CoreWeave?</p>\n<p>At CoreWeave, we work hard, have fun, and move fast! We&#39;re in an exciting stage of hyper-growth that you will not want to miss out on. 
We&#39;re not afraid of a little chaos, and we&#39;re constantly learning. Our team cares deeply about how we build our product and how we work together, which is represented through our core values:</p>\n<ul>\n<li>Be Curious at Your Core</li>\n<li>Act Like an Owner</li>\n<li>Empower Employees</li>\n<li>Deliver Best-in-Class Client Experiences</li>\n<li>Achieve More Together</li>\n</ul>\n<p>We support and encourage an entrepreneurial outlook and independent thinking. We foster an environment that encourages collaboration and provides the opportunity to develop innovative solutions to complex problems. As we get set for take off, the growth opportunities within the organization are constantly expanding. You will be surrounded by some of the best talent in the industry, who will want to learn from you, too. Come join us!</p>\n<p>Salary Range: $143,000 to $210,000</p>\n<p>Benefits:</p>\n<ul>\n<li>Medical, dental, and vision insurance - 100% paid for by CoreWeave</li>\n<li>Company-paid Life Insurance</li>\n<li>Voluntary supplemental life insurance</li>\n<li>Short and long-term disability insurance</li>\n<li>Flexible Spending Account</li>\n<li>Health Savings Account</li>\n<li>Tuition Reimbursement</li>\n<li>Ability to Participate in Employee Stock Purchase Program (ESPP)</li>\n<li>Mental Wellness Benefits through Spring Health</li>\n<li>Family-Forming support provided by Carrot</li>\n<li>Paid Parental Leave</li>\n<li>Flexible, full-service childcare support with Kinside</li>\n<li>401(k) with a generous employer match</li>\n<li>Flexible PTO</li>\n<li>Catered lunch each day in our office and data center locations</li>\n<li>A casual work environment</li>\n<li>A work culture focused on innovative disruption</li>\n</ul>\n<p>Workplace:</p>\n<p>While we prioritize a hybrid work environment, remote work may be considered for candidates located more than 30 miles from an office, based on role requirements for specialized skill sets. New hires will be invited to attend onboarding at one of our hubs within their first month. 
Teams also gather quarterly to support collaboration.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_760c3e88-e35","directApply":true,"hiringOrganization":{"@type":"Organization","name":"CoreWeave","sameAs":"https://www.coreweave.com","logo":"https://logos.yubhub.co/coreweave.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coreweave/jobs/4649824006","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$143,000 to $210,000","x-skills-required":["data product management","data architecture","enterprise data engineering","data lakes","data warehouses","ETL/ELT and streaming pipelines","data governance frameworks","modern data stack technologies","Snowflake","BigQuery","Databricks","Apache Spark","Airflow","DBT","Kafka","data modeling","domain-driven design","scalable data platforms","BI and analytics platforms","Tableau","Looker","Power BI"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:48:58.405Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Livingston, NJ / New York, NY / Sunnyvale, CA / Bellevue, WA/San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data product management, data architecture, enterprise data engineering, data lakes, data warehouses, ETL/ELT and streaming pipelines, data governance frameworks, modern data stack technologies, Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka, data modeling, domain-driven design, scalable data platforms, BI and analytics platforms, Tableau, Looker, Power BI","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":143000,"maxValue":210000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3168d7d3-70b"},"title":"Partner Solutions Architect - North America","description":"<p>About Us</p>\n<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across North America. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>As a Partner Solutions Architect, you will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud. 
Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Partner closely with North America Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>\n<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>\n<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n<li>Familiarity with cloud 
marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>Benefits</p>\n<ul>\n<li>Unlimited vacation (and yes we use it!)</li>\n<li>Pension coverage</li>\n<li>Excellent healthcare</li>\n<li>Paid Parental Leave</li>\n<li>Wellness stipend</li>\n<li>Home office stipend, and more!</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3168d7d3-70b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673630005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data platforms","Snowflake","Databricks","Google Cloud","partner development","field engineering","sales engineering","consulting","partner engineering"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:48:30.813Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Canada - Remote; US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner development, field engineering, sales engineering, consulting, partner engineering, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0b431957-4b8"},"title":"Planning Analyst","description":"<p>The Planning organisation plays a critical role in driving operational outcomes by translating demand insights into precise production and procurement signals. Within this group, the Systems &amp; Process team partners closely with Anduril&#39;s Analytics team to build scalable workflows, ontologies, and data models that power planning excellence.</p>\n<p>We&#39;re looking for a Planning Analyst who thrives at the intersection of digital and the physical hardware world. 
You&#39;ll design and implement data models, workflows, and analytical tools that directly improve operational performance, enabling Anduril to plan more effectively at scale.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Partner with Planning leadership and Anduril&#39;s Analytics team to build secure, timely, and extensible ontologies for domains such as Supply Chain, Manufacturing, and Finance.</li>\n<li>Develop reporting and data applications that provide near real-time visibility into planning KPIs, including forecast accuracy, fill rates, and inventory turns.</li>\n<li>Conduct ad-hoc analyses on demand signals, inventory strategies, and supply risk, delivering insights that balance service levels with profitability.</li>\n<li>Run scenario planning and sensitivity analyses to evaluate the impact of market or contract volatility on operational outcomes.</li>\n<li>Collaborate with Analytics engineers to generalise planning workflows and data products across Anduril&#39;s production and sustainment organisations.</li>\n<li>Act as a subject matter expert for planning tools and systems, supporting upgrades, enhancements, and best practice adoption.</li>\n<li>Become a trusted resource for leadership by helping them run their organisations more effectively through data-driven insights.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>3+ years of experience in an analytics-focused role (Analytics Engineer, Data Engineer, or Analyst/Consultant with strong supply chain or operations background).</li>\n<li>Strong operational instincts and a track record of owning complex, ambiguous planning problems from start to finish.</li>\n<li>Ability to think strategically and execute tactically.</li>\n<li>Data-driven mindset and ability to turn analytics into action.</li>\n<li>Systems fluency, with experience working across ERP platforms (e.g., NetSuite, SAP, Oracle).</li>\n<li>Cross-functional empathy and ability to understand the needs of engineering, product, deployment, and finance.</li>\n<li>Comfortable with ambiguity and ability to build structure in environments that are scaling fast.</li>\n<li>High ownership, low ego, and value results over credit.</li>\n<li>Ability to anticipate friction before it happens and proactively work to prevent issues rather than react to them.</li>\n<li>Demonstrated experience building and owning processes and ontologies in a fast-paced environment.</li>\n<li>Expert-level SQL skills and proficiency in Python (or other programming languages).</li>\n<li>Experience with BI and analytics tools (Looker, Tableau, Power BI, Palantir Foundry, dbt, Redshift, etc.).</li>\n<li>Strong ability to translate technical models into actionable insights for non-technical stakeholders.</li>\n<li>Self-starter mindset and ability to prioritise velocity and impact.</li>\n<li>Eligible to obtain and maintain an active U.S. 
Secret security clearance.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0b431957-4b8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/4657698007","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$98,000-$130,000 USD","x-skills-required":["SQL","Python","ERP platforms (e.g., NetSuite, SAP, Oracle)","BI and analytics tools (Looker, Tableau, Power BI, Palantir Foundry, dbt, Redshift, etc.)","Data modelling","Workflow design","Analytical tools","Supply chain management","Manufacturing","Finance"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:48:18.017Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, ERP platforms (e.g., NetSuite, SAP, Oracle), BI and analytics tools (Looker, Tableau, Power BI, Palantir Foundry, dbt, Redshift, etc.), Data modelling, Workflow design, Analytical tools, Supply chain management, Manufacturing, Finance","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":98000,"maxValue":130000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_14d46b7f-188"},"title":"Senior Analyst, Field Analytics","description":"<p>We are looking for a Senior Analyst, Field Analytics to join our Go-To-Market Strategy &amp; Operations group. As part of this team, you will drive insight and scale within the global field organisation by building high-impact technical assets, ranging from executive Tableau dashboards to standardised Snowflake datasets.</p>\n<p>Your responsibilities will include designing, building, and maintaining high-visibility Tableau dashboards and reporting assets that provide actionable insights to business partners across the global organisation. You will also build and optimise production-grade data sets in Snowflake, ensuring that all field data (Pipeline, Bookings, Productivity) is clean, structured, and easily accessible for self-service analysis.</p>\n<p>In addition, you will take ownership of the technical documentation for all GTM reporting assets, ensuring data lineage, metric definitions, and logic are clearly defined and accessible. You will also champion the use of Generative AI tools to accelerate the analytics lifecycle, including automating SQL query generation, streamlining data preparation, and enhancing report documentation.</p>\n<p>To succeed in this role, you will need to have 5+ years of professional experience in Data Analytics or Business Intelligence, ideally within a global delivery model. 
You will also need to have technical stack expertise in SQL (Snowflake), BI visualisation tool (Tableau), and CRM data (Salesforce).</p>\n<p>Preferred qualifications include proficiency with scripting (python) &amp; data modelling (dbt), deep understanding of enterprise software sales processes, field operations, and cross-functional GTM mechanics.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_14d46b7f-188","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7728562","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL (Snowflake)","BI visualisation tool (Tableau)","CRM data (Salesforce)","Generative AI tools","Data Analytics","Business Intelligence"],"x-skills-preferred":["Scripting (python)","Data modelling (dbt)","Enterprise software sales processes","Field operations","Cross-functional GTM mechanics"],"datePosted":"2026-04-18T15:48:12.823Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL (Snowflake), BI visualisation tool (Tableau), CRM data (Salesforce), Generative AI tools, Data Analytics, Business Intelligence, Scripting (python), Data modelling (dbt), Enterprise software sales processes, Field operations, Cross-functional GTM mechanics"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8eda4ab8-41d"},"title":"Staff  Backend Engineer - Digital Credentials","description":"<p>We are looking for a Staff Backend Engineer to join our Digital Credentials team. As a Staff Backend Engineer, you will define the technical direction of a new VDC identity ecosystem and set the standard for how we build it.</p>\n<p>This role demands deep technical expertise in backend architecture and the ability to influence decisions across the team and adjacent engineering groups. You will thrive in a fast-paced, zero-to-one environment, cutting through ambiguity to drive product-market fit.</p>\n<p>In practice, that means using AI tools for architectural reasoning and design review, generating and validating test coverage, accelerating code review cycles, and helping engineers build judgment about when AI output is trustworthy and when it isn&#39;t.</p>\n<p>You&#39;ll define what &#39;good&#39; looks like here and bring the team along. 
You&#39;ll also be a force multiplier - mentoring senior engineers, raising the technical bar through architecture.</p>\n<p>Proactively shape the technical strategy and architecture of a greenfield VDC identity ecosystem, partnering with principal engineers to ensure decisions prioritize open standards and interoperability.</p>\n<p>Own architectural decisions in a fast-moving environment - balancing rapid delivery with long-term scalability, security, and platform integrity.</p>\n<p>Identify and mitigate technical risks before they become blockers; escalate dependencies early and widely.</p>\n<p>Own production health for your systems - incident response, root cause identification, and driving operational improvement projects (monitoring coverage, alert reduction, runbook quality).</p>\n<p>Champion high engineering standards across the team, including rigorous code reviews, comprehensive unit/functional testing, and architectural review processes to ensure the platform is tamper-proof.</p>\n<p>Drive continual evolution in the adoption of AI-native development practices, helping the team rethink how we design, review, and ship software with AI as a core part of the workflow.</p>\n<p>Proactively identify and prioritize tech debt; drive improvements in areas beyond feature work - CI/CD, observability, documentation, reliability, and support processes.</p>\n<p>Partner with Product and Design to understand and own the desired customer outcome, not just the technical deliverable.</p>\n<p>Actively invest in the growth of peer engineers, taking responsibility for raising the technical bar of the team as a whole.</p>\n<p>Operates well beyond basic AI tool usage - understands how to apply AI meaningfully across the full software development lifecycle, can evaluate and adopt new AI tooling with good judgment, and is able to establish practices and bring a team along.</p>\n<p>Proven experience defining scalable backend architectures and integrating complex APIs with modern web technologies - including the ability to make and defend architectural decisions with lasting org-wide impact.</p>\n<p>Solid understanding of web authentication and authorization fundamentals - how auth flows work, token-based access (JWT, OAuth), and best practices for securing sensitive user data.</p>\n<p>Deep expertise in building highly reliable, secure, enterprise-grade software using Java or Node.js.</p>\n<p>Proficiency in JavaScript or TypeScript, with the ability to navigate and contribute to the frontend to unblock feature development.</p>\n<p>Strong understanding of cloud architecture principles, with experience designing and operating large-scale services on AWS.</p>\n<p>Hands-on experience with identity protocols such as OIDC4VP, OIDC4VCI, OAuth, and the W3C Verifiable Credentials data model.</p>\n<p>Prior experience leading a team of engineers through a meaningful shift in ways of working - whether around AI adoption, developer tooling, or engineering culture change.</p>\n<p>Practical knowledge of cryptography: public/private key infrastructure (PKI), digital signatures, and hashing, particularly in the context of Verifiable Credentials or DIDs.</p>\n<p>Experience building or maintaining publicly released SDKs used by external developers.</p>\n<p>Experience in one of the following areas is preferred: developing Identity and Access Management (IAM) tooling, working with blockchain technology, or contributing to a &#39;zero-to-one&#39; startup-like team.</p>\n<p>The annual base salary range for this position for 
candidates located in Canada is between $160,000-$220,000 CAD.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_8eda4ab8-41d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7542109","x-work-arrangement":"hybrid","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$160,000-$220,000 CAD","x-skills-required":["backend architecture","AI tools","architectural reasoning","design review","test coverage","code review cycles","web authentication","authorization","Java","Node.js","JavaScript","TypeScript","cloud architecture","AWS","identity protocols","OIDC4VP","OIDC4VCI","OAuth","W3C Verifiable Credentials data model"],"x-skills-preferred":["leading a team of engineers","developer tooling","engineering culture change","cryptography","public/private key infrastructure","digital signatures","hashing","Verifiable Credentials","DIDs","SDKs","Identity and Access Management","blockchain technology"],"datePosted":"2026-04-18T15:48:05.192Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Toronto, Ontario, Canada"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"backend architecture, AI tools, architectural reasoning, design review, test coverage, code review cycles, web authentication, authorization, Java, Node.js, JavaScript, TypeScript, cloud architecture, AWS, identity protocols, OIDC4VP, OIDC4VCI, OAuth, W3C Verifiable Credentials data model, leading a team of engineers, developer tooling, engineering culture change, cryptography, public/private key infrastructure, digital signatures, hashing, Verifiable Credentials, DIDs, SDKs, Identity and Access Management, blockchain technology","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":160000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3ff860ce-94c"},"title":"Staff, Advanced Analytics, CS Safety","description":"<p>We are looking for a Staff Advanced Analyst to help Airbnb enable travel for our millions of guests and hosts on our platform. This role will sit under the Advanced Analytics family and support Product and Business leaders within our CS Safety organisation.</p>\n<p>As a Staff Advanced Analyst, you will be a data thought partner to product and business leaders across teams through providing insights, recommendations, and enabling data-informed decisions. You will drive day-to-day analytics and create scalable data tools, identify pain points in travelling and hosting, and work with product leadership to improve experiences for our guest, host, and agent community.</p>\n<p>In addition, you will leverage Airbnb&#39;s rich and unique data, state-of-the-art machine learning infrastructure, and other central data science tools to build and grow the measurement capacity within the organisation. 
You will also be deeply involved in the technical details of the various systems we build, and will have the opportunity to collaborate with a strong team of engineers, product managers, designers, and operations agents to achieve shared, cross-functional goals to help keep Airbnb&#39;s community safe and trusted.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Leading and driving data-driven roadmaps for the CS Safety working groups</li>\n<li>Recommending actionable solutions backed by data and metrics to product and operational problems</li>\n<li>Building and owning an insights and reporting platform that measures and improves the effectiveness of behaviours, product interfaces, and processes across the CS Safety platform and contact centre network</li>\n<li>Performing data modelling of the various entities using tools and frameworks for optimising community and agent experiences</li>\n<li>Defining and evaluating key metrics in an unstructured problem space, including measurement of the ML models that drive product development</li>\n<li>Anticipating emerging safety risks through early-warning indicators, trend analysis, predictive modelling, and scenario planning to assess operational risk</li>\n<li>Influencing data-driven decisions across business verticals in day-to-day via business reviews, scorecards, self-serve portal, OKRs, and planning among others</li>\n<li>Influencing experimentation and measurement strategies; conducting power analyses, defining exit criteria, and using statistical models to improve inference</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>A minimum of 10+ years of industry experience in business analytics and a degree (Masters or PhD) in a quantitative field (e.g., Statistics, Econometrics, Computer Science, Engineering, Mathematics, Data Science, Operations Research)</li>\n<li>Experience supporting safety, risk, Trust &amp; Safety, compliance, or employee wellbeing in high-volume call centre or customer operations environments</li>\n<li>Expert skills in SQL and expert in at least one programming language for data analysis (Python or R)</li>\n<li>Experience with non-experimental causal inference methods, experimentation, and machine learning techniques, ideally in a multi-sided platform setting</li>\n<li>Working knowledge of schema design and high-dimensional data modelling (ETL framework like Airflow)</li>\n<li>Ability to work under conditions of ambiguity in a fast-growth, sometimes uncertain and complex environment</li>\n<li>Comfortable operating independently with minimal planning, direction, and supervision</li>\n<li>Proven track record of influencing senior leaders and driving outcomes</li>\n</ul>\n<p>Experience Level: Staff Employment Type: Full-time Workplace Type: Remote Category: Engineering Industry: Technology Salary Range: $176,000-$220,000 USD Required Skills: SQL, Python, R, Machine Learning, Data Analysis, Data Modelling, Causal Inference, Experimentation, Statistical Models Preferred Skills: Data Science, Operations Research, Statistics, Econometrics, Computer Science, Engineering, Mathematics</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3ff860ce-94c","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Airbnb","sameAs":"https://www.airbnb.com/","logo":"https://logos.yubhub.co/airbnb.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/airbnb/jobs/7579193","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$176,000-$220,000 USD","x-skills-required":["SQL","Python","R","Machine Learning","Data Analysis","Data Modelling","Causal Inference","Experimentation","Statistical Models"],"x-skills-preferred":["Data Science","Operations Research","Statistics","Econometrics","Computer Science","Engineering","Mathematics"],"datePosted":"2026-04-18T15:47:55.305Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"engineering","industry":"technology","skills":"SQL, Python, R, Machine Learning, Data Analysis, Data Modelling, Causal Inference, Experimentation, Statistical Models, Data Science, Operations Research, Statistics, Econometrics, Computer Science, Engineering, Mathematics","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":176000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b7b8d06f-881"},"title":"Backend Engineer, Knowledge Graph (Rust)","description":"<p>As an Intermediate Backend Engineer on the GitLab Knowledge Graph team, you&#39;ll help build and operate a graph data service that supports GitLab Duo agents, analytics, and architecture-level features across GitLab.com, Dedicated, and Self-Managed deployments.</p>\n<p>You&#39;ll join a small, Rust-first team that values clear ownership, thoughtful system design, and rigorous thinking about data and reliability. The Knowledge Graph service is a Rust backend that builds a property graph from GitLab’s software development lifecycle (SDLC) and code data. It uses ClickHouse, NATS JetStream, and the Data Insights Platform. It exposes secure graph queries and MCP tools used by AI agents and product features.</p>\n<p>In this role, you’ll deliver features and improvements in well-scoped areas, learn the broader architecture, and contribute to reliability, observability, and operational readiness. In your first year, you’ll take clear ownership of specific components or features (for example, parts of the SDLC indexing pipeline or query paths). 
You’ll help reduce single points of failure with better tests and runbooks, and you’ll help the team ship analytical services that are easier to maintain and evolve over time.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Implement and iterate on backend features in the Rust-based Knowledge Graph service, including changes to the query engine, SDLC and code indexing flows, and API endpoints (including MCP endpoints) under guidance from senior and staff engineers.</li>\n</ul>\n<ul>\n<li>Help maintain integrations between Knowledge Graph and the rest of the GitLab platform, working in areas that touch GitLab Rails, the Data Insights Platform (Siphon, NATS, ClickHouse), and GitLab Duo Agent Platform.</li>\n</ul>\n<ul>\n<li>Contribute to system design discussions by proposing options, raising questions, and documenting decisions, with a focus on reliability, scalability, and maintainability for analytical graph workloads.</li>\n</ul>\n<ul>\n<li>Improve the operational maturity of the service by adding or enhancing metrics, logging, runbooks, alerts, and small readiness tasks, and by participating in on-call rotation as appropriate for your level and experience.</li>\n</ul>\n<ul>\n<li>Collaborate asynchronously with product, data, infrastructure, security, and AI counterparts to clarify requirements, align on scope, and ship features safely for customers and sustainably for the team.</li>\n</ul>\n<ul>\n<li>Use AI-assisted development workflows responsibly (for example, using Knowledge Graph-backed agents and internal Duo tooling), and share what works with the team while keeping a strong focus on code quality and correctness.</li>\n</ul>\n<ul>\n<li>Participate in code reviews, knowledge-sharing sessions, and pairing to both learn from others and help maintain consistent standards across the codebase.</li>\n</ul>\n<ul>\n<li>Contribute across the stack when needed, including occasional Ruby work for Rails integration and authorization paths, or small frontend changes related to Knowledge Graph features (for example, Software Architecture Map UI plumbing).</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Professional experience building and maintaining backend systems in production, with an understanding of reliability, maintainability, and how to support services over time (incident responses, and follow-ups, etc).</li>\n</ul>\n<ul>\n<li>Proficiency in at least one modern backend language and strong interest in Rust, with either prior Rust experience or clear evidence you can ramp quickly and deliver in a Rust-first, performance-sensitive codebase.</li>\n</ul>\n<ul>\n<li>Some exposure to distributed data or analytics systems (for example, OLAP databases, Kafka- or NATS-style messaging, or change data capture (CDC) pipelines), or strong motivation to develop those skills in this role.</li>\n</ul>\n<ul>\n<li>Interest in graph data modeling and query patterns (property graphs, multi-step (n-hop) traversals, aggregations), and willingness to learn the tools and concepts used in Knowledge Graph over time.</li>\n</ul>\n<ul>\n<li>Practical experience (or strong interest) using AI tools in day-to-day development, along with a thoughtful approach to validating outputs and integrating AI into your workflow.</li>\n</ul>\n<ul>\n<li>A language-agnostic mindset and evidence that you can pick up new languages and frameworks as needed (for example, Ruby, Go, or TypeScript/Vue where the work touches adjacent systems).</li>\n</ul>\n<ul>\n<li>Solid fundamentals in system design for your level, including the ability to 
reason about trade-offs, ask good questions, and align your implementation work with documented architectural decisions.</li>\n</ul>\n<ul>\n<li>Comfort working in a low-process, high-ownership environment where you take responsibility for your work, communicate progress clearly, and help refine problem statements with your teammates.</li>\n</ul>\n<ul>\n<li>Strong written communication and comfort collaborating asynchronously across time zones in an all-remote team.</li>\n</ul>\n<p>About the team:</p>\n<p>We sit within the Data Engineering organization. We&#39;re a small group of senior engineers and we work closely with partners across AI (Duo Agent Platform), analytics, infrastructure and delivery, and security because our work spans many parts of the platform. We collaborate asynchronously and optimize for strong ownership rather than a feature factory model. We each build a meaningful understanding of the system and help evolve it over time. A key challenge for us right now is scaling sustainably. That includes hardening multi-tenant behavior, maturing observability and readiness, and keeping the system healthy and maintainable as usage grows and team members take time off. At the same time, we&#39;re bringing Knowledge Graph to general availability (GA).</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b7b8d06f-881","directApply":true,"hiringOrganization":{"@type":"Organization","name":"GitLab","sameAs":"https://about.gitlab.com/","logo":"https://logos.yubhub.co/about.gitlab.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/gitlab/jobs/8437754002","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$98,000-$210,000 USD","x-skills-required":["Rust","backend systems","reliability","maintainability","distributed data","analytics systems","graph data modeling","query patterns","AI tools","system design","low-process","high-ownership"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:47:47.362Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, Canada; Remote, US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Rust, backend systems, reliability, maintainability, distributed data, analytics systems, graph data modeling, query patterns, AI tools, system design, low-process, high-ownership","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":98000,"maxValue":210000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ca34a0c3-c68"},"title":"Data Analyst, Product","description":"<p>As a Product Data Analyst on GitLab&#39;s Product Data Insights team, you&#39;ll help product teams make better decisions through trusted data and clear analysis. Working closely with Product Managers, Engineering, UX, and leadership, you&#39;ll examine customer behaviour across the customer journey to improve both the customer experience and business outcomes.</p>\n<p>In this role, you&#39;ll build business intelligence solutions, define and maintain product key performance indicators, and turn usage data into insights that support strategic decisions. 
This role focuses on GitLab&#39;s Platforms sections, an important area that helps ensure GitLab&#39;s infrastructure can scale over time. You&#39;ll also support our Dedicated offering by exploring usage and behavioural patterns across that deployment type.</p>\n<p>In GitLab&#39;s all-remote, asynchronous environment, you&#39;ll contribute to a transparent, values-driven culture where data storytelling helps teams understand product health and act with confidence.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Collaborate with stakeholders ranging from individual contributor Product Managers to leadership to understand business questions and define the right analytical approach.</li>\n<li>Gather data from multiple sources and connect disparate data points into a clear story about product usage, customer behaviour, and product health.</li>\n<li>Establish reporting for new products and features, including defining metrics and building repeatable analysis that teams can use to track adoption and performance.</li>\n<li>Analyze usage patterns within GitLab&#39;s Platforms sections to help teams understand how this area is performing and where improvements may be needed.</li>\n<li>Explore behavioural trends for the Dedicated offering to surface insights that can inform product decisions and the customer experience.</li>\n<li>Partner with UX to better understand user behaviour and support research with quantitative analysis.</li>\n<li>Work with Engineering teams on data collection and instrumentation approaches so product usage can be measured accurately and consistently.</li>\n<li>Collaborate with Analytics Engineers on data modelling needs for reporting and analysis, helping ensure the underlying data supports reliable decision-making.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Advanced SQL skills and the ability to use data to answer complex product questions with clarity and accuracy.</li>\n<li>Experience building business intelligence visualisations and dashboards, ideally in Tableau or a similar tool.</li>\n<li>Experience working directly with Product and Engineering teams on data creation, instrumentation, and measurement strategy.</li>\n<li>Experience partnering with Data and Analytics Engineering teams to shape data models for reporting and product analysis.</li>\n<li>Strong analytical thinking and the ability to turn product usage data into practical recommendations for stakeholders at different levels.</li>\n<li>Experience conducting and interpreting A/B tests to evaluate product changes and support evidence-based decisions.</li>\n<li>Clear written and verbal communication skills, with the ability to explain complex findings in simple language.</li>\n<li>Comfort working in an all-remote, asynchronous environment and collaborating effectively across functions while staying aligned with GitLab&#39;s values.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ca34a0c3-c68","directApply":true,"hiringOrganization":{"@type":"Organization","name":"GitLab","sameAs":"https://about.gitlab.com/","logo":"https://logos.yubhub.co/about.gitlab.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/gitlab/jobs/8455464002","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$78,400-$168,000 USD","x-skills-required":["SQL","Tableau","Business Intelligence","Data Analysis","Data Modelling","A/B 
Testing","Communication"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:47:40.923Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, North America"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Tableau, Business Intelligence, Data Analysis, Data Modelling, A/B Testing, Communication","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":78400,"maxValue":168000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_41c3ee08-08e"},"title":"Optimization Software Engineer","description":"<p>We are looking for a talented mid-level Software Engineer with a strong background in optimization to join our growing team at Anduril Labs. In this role, you will be instrumental in developing advanced algorithms and software solutions to tackle complex, multi-domain optimization problems critical to national defense and Anduril&#39;s autonomous systems.</p>\n<p>The ideal candidate possesses deep expertise in classical optimization algorithms, robust Python programming skills, and a solid foundation in data modeling. Experience with developing hybrid quantum optimization solutions is a plus.</p>\n<p>You will leverage state-of-the-art, GenAI-powered development tools such as Claude Code to accelerate solution development and enhance our optimization software. This role demands creative problem-solving, a self-starter mentality, and the ability to rapidly apply algorithmic theory and mathematic modeling to practical, real-world optimization challenges.</p>\n<p>You will be designing, implementing, and deploying optimization algorithms and services that integrate seamlessly into larger defense systems, working across various platforms (on-prem, cloud, and hybrid quantum computing environments).</p>\n<p>Familiarity with modeling linear and non-linear optimization problems, rapid prototyping, integrating optimization solutions into existing architectures, leveraging APIs, and utilizing open-source tools will be crucial.</p>\n<p>If you thrive in a dynamic environment that values creative problem-solving, love writing code, excel as both an individual contributor and team player, are eager to learn, and bring a can-do attitude, this role is for you.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Design, develop, and implement highly efficient optimization algorithms and software solutions to solve challenging problems in areas such as resource allocation, scheduling, routing, mission planning, control systems, and supply chain logistics.</li>\n</ul>\n<ul>\n<li>Apply classical optimization techniques (e.g., linear programming, mixed-integer linear programming, combinatorial optimization, network flow, dynamic programming, heuristics, metaheuristics) to model and explore novel approaches.</li>\n</ul>\n<ul>\n<li>Utilize GenAI tools (e.g., OpenAI Codes, Claude Code, GitHub Copilot) to rapidly prototype, refine, and test algorithmic solutions, improving development velocity and code quality.</li>\n</ul>\n<ul>\n<li>Develop robust data models and efficient data pipelines to support complex optimization problems, ensuring data integrity and efficient processing for algorithmic inputs and outputs.</li>\n</ul>\n<ul>\n<li>Collaborate with multidisciplinary teams (software engineers, data scientists, domain experts, product managers) to integrate 
optimization engines and services into larger defense systems and platforms.</li>\n</ul>\n<ul>\n<li>Perform rigorous testing, validation, and performance analysis of optimization solutions, ensuring scalability, reliability, and accuracy under diverse operational conditions.</li>\n</ul>\n<ul>\n<li>Participate actively in the entire Software Development Lifecycle (SDLC) from requirements gathering and design to deployment, monitoring, and maintenance.</li>\n</ul>\n<ul>\n<li>Support Anduril- and customer-funded R&amp;D efforts, contributing to technical documentation, presentations, and patent applications.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s degree in Computer Science, Software Engineering, Applied Mathematics, Operations Research, or a related quantitative field.</li>\n</ul>\n<ul>\n<li>3+ years of professional experience in software development with a dedicated focus on optimization, algorithmic problem-solving, or operations research.</li>\n</ul>\n<ul>\n<li>Experience solving optimization problems in defense, transportation, supply chain, logistics, network optimization, smart grids or similar.</li>\n</ul>\n<ul>\n<li>Expert proficiency in Python for scientific computing and robust software development.</li>\n</ul>\n<ul>\n<li>Strong theoretical and practical understanding of classical optimization algorithms (e.g., linear programming, mixed-integer linear programming, constraint programming, network flow, dynamic programming, heuristics, meta heuristics).</li>\n</ul>\n<ul>\n<li>Hands-on experience with optimization libraries and commercial/open-source solvers (e.g., SciPy Optimize, PuLP, CVXPY, Gurobi, CPLEX, OR-Tools, GEKKO).</li>\n</ul>\n<ul>\n<li>Solid experience with data modeling, data structures, and algorithms to efficiently prepare, process, and manage data for optimization problems.</li>\n</ul>\n<ul>\n<li>Demonstrable hands-on experience using GenAI tools (e.g., OpenAI Codex, Claude Code, Gemini Code Assist, GitHub Copilot, Amazon CodeWhisperer, or similar) for software development, code generation, debugging, and algorithmic exploration.</li>\n</ul>\n<ul>\n<li>Proficiency in using numerical computing libraries such as NumPy, SciPy, and Pandas.</li>\n</ul>\n<ul>\n<li>Demonstrated understanding and application of software testing principles and practices, including unit testing, integration testing, and end-to-end testing.</li>\n</ul>\n<ul>\n<li>Ability to develop, test, and deploy software effectively on Linux-based systems.</li>\n</ul>\n<ul>\n<li>Eligible to obtain and maintain an active U.S. Top Secret SCI security clearance.</li>\n</ul>\n<ul>\n<li>Experience with Git version control, build tools, and CI/CD pipelines.</li>\n</ul>\n<ul>\n<li>Strong problem-solving skills, meticulous attention to detail, and the ability to work effectively in a collaborative team environment.</li>\n</ul>\n<ul>\n<li>Excellent communication and interpersonal skills, with the ability to effectively articulate complex technical concepts to diverse audiences.</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Master&#39;s or Ph.D. 
in Computer Science, Applied Mathematics, Operations Research, or a closely related quantitative field.</li>\n</ul>\n<ul>\n<li>Familiarity with or a strong interest in quantum optimization algorithms, quantum computing concepts, or quantum-inspired heuristic approaches.</li>\n</ul>\n<ul>\n<li>Experience with D-Wave’s quantum annealing platform is a plus.</li>\n</ul>\n<ul>\n<li>Experience with performance-critical programming languages such as C++ or Java.</li>\n</ul>\n<ul>\n<li>Experience with cloud platforms (e.g., AWS, Azure, GCP) for deploying scalable optimization solutions or high-performance computing (HPC) environments.</li>\n</ul>\n<ul>\n<li>Prior experience in defense, aerospace, logistics, supply chain management, robotics, or manufacturing optimization domains.</li>\n</ul>\n<ul>\n<li>Familiarity with integrating machine learning models with optimization techniques (e.g., prescriptive analytics, reinforcement learning for optimization).</li>\n</ul>\n<ul>\n<li>Excellent communication skills with the ability to articulate complex technical concepts, present findings, and influence technical direction across diverse teams.</li>\n</ul>\n<ul>\n<li>Willingness to travel up to approximately 10%.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_41c3ee08-08e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5089067007","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$132,000-$198,000 USD","x-skills-required":["Python","Classical optimization algorithms","Data modeling","GenAI tools","Optimization libraries","Commercial/open-source solvers","Numerical computing libraries","Software testing principles","Linux-based systems","Git version control","Build tools","CI/CD pipelines"],"x-skills-preferred":["Quantum optimization algorithms","Quantum computing concepts","Quantum-inspired heuristic approaches","Performance-critical programming languages","Cloud platforms","High-performance computing environments","Machine learning models","Prescriptive analytics","Reinforcement learning for optimization"],"datePosted":"2026-04-18T15:47:14.850Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, District of Columbia, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Classical optimization algorithms, Data modeling, GenAI tools, Optimization libraries, Commercial/open-source solvers, Numerical computing libraries, Software testing principles, Linux-based systems, Git version control, Build tools, CI/CD pipelines, Quantum optimization algorithms, Quantum computing concepts, Quantum-inspired heuristic approaches, Performance-critical programming languages, Cloud platforms, High-performance computing environments, Machine learning models, Prescriptive analytics, Reinforcement learning for optimization","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":132000,"maxValue":198000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5a29684d-d2d"},"title":"Senior Analytics 
Developer - Platform Analytics","description":"<p>We&#39;re looking for a Senior Analytics Engineer to join our Platform Analytics team. In this role, you&#39;ll design and evolve core analytical data models that power trusted, self-service analytics across Elastic. You&#39;ll shape the underlying structure of our analytics layer, aligning definitions, improving usability, and enabling faster, more reliable insights for teams across the company.</p>\n<p>This role goes beyond delivering within existing patterns. You&#39;ll improve foundational modeling decisions, reduce rework, and establish standards that scale.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design and build core analytical data models in BigQuery using dbt</li>\n<li>Refactor and restructure existing models to improve clarity, consistency, and ease of use</li>\n<li>Partner directly with solution teams to translate business needs into well-defined, reusable data models</li>\n<li>Define and enforce modeling standards, conventions, and layer contracts</li>\n<li>Standardize identifiers and business logic early in the transformation layer to reduce downstream complexity</li>\n<li>Centralize shared business rules and definitions to enable consistent, trusted analytics</li>\n<li>Explore and apply AI-assisted approaches to improve analytics workflows</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Strong expertise in Python, SQL, and analytics data modeling</li>\n<li>5+ years of experience in analytics engineering, data engineering, or a related role</li>\n<li>Hands-on experience designing analytics layers in BigQuery and dbt</li>\n<li>Proven ability to create analyst-friendly data models with clear structure and predictable behavior</li>\n<li>Experience setting standards and influencing how data is modeled and consumed across teams</li>\n<li>Strong analytical thinking and problem-solving skills</li>\n<li>Clear written and verbal communication skills</li>\n</ul>\n<p><strong>Bonus Points</strong></p>\n<ul>\n<li>Experience working in a distributed or remote-first environment</li>\n<li>Familiarity with metric definitions or semantic layers</li>\n<li>Experience applying AI or automation to analytics or data modeling workflows</li>\n</ul>\n<p>Compensation for this role is in the form of base salary. This role does not have a variable compensation component. 
The typical starting salary range for new hires in this role is $128,300-$203,000 CAD.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5a29684d-d2d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Elastic, the Search AI Company","sameAs":"https://www.elastic.co/","logo":"https://logos.yubhub.co/elastic.co.png"},"x-apply-url":"https://job-boards.greenhouse.io/elastic/jobs/7614524","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$128,300-$203,000 CAD","x-skills-required":["Python","SQL","analytics data modeling","BigQuery","dbt","AI-assisted approaches"],"x-skills-preferred":["metric definitions","semantic layers","AI or automation"],"datePosted":"2026-04-18T15:46:47.387Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Canada"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, SQL, analytics data modeling, BigQuery, dbt, AI-assisted approaches, metric definitions, semantic layers, AI or automation","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":128300,"maxValue":203000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0a3dc5a7-8d9"},"title":"Senior Analytics Engineer","description":"<p>We are seeking a Senior Analytics Engineer to support the Enterprise by building reliable, well-modeled, and trusted data for reporting, decision-making, and emerging AI use cases.</p>\n<p>As a Senior Analytics Engineer, you will design scalable data models, define consistent business logic, and help establish a strong semantic foundation that enables both human analytics and machine-driven intelligence.</p>\n<p>You will partner closely with Finance, People and Company Operations stakeholders, Data Analysts, and Data Engineers to ensure data is accurate, consistent, and easy to consume; whether through dashboards, self-service exploration, or AI-powered workflows.</p>\n<p>Responsibilities:</p>\n<p>Data Modeling &amp; Semantics</p>\n<ul>\n<li>Design, build, and maintain scalable data models using dbt and Snowflake</li>\n<li>Define and standardize core Finance, HR and Enterprise level metrics (e.g., revenue, ARR, billing, Attrition, Executive Insights, Security) with clear, governed logic</li>\n<li>Establish consistent modeling patterns, naming conventions, and semantic clarity across datasets</li>\n<li>Contribute to a shared semantic layer that supports both analytics and AI use cases</li>\n</ul>\n<p>AI-Ready Data &amp; Snowflake Ecosystem</p>\n<ul>\n<li>Prepare high-quality, well-governed datasets for use with Snowflake Cortex and Snowflake Intelligence</li>\n<li>Enable structured data foundations that support LLM-powered use cases, semantic querying, and intelligent applications</li>\n<li>Ensure data is context-rich, well-documented, and aligned with business meaning to improve AI accuracy and trust</li>\n</ul>\n<p>Data Quality, Governance &amp; Trust</p>\n<ul>\n<li>Implement robust testing, validation, and documentation practices in dbt</li>\n<li>Ensure consistency across reports and dashboards through shared definitions and reusable models</li>\n<li>Apply data governance best practices, including access controls, lineage, and 
auditability</li>\n<li>Partner across teams to establish clear ownership and accountability for data assets</li>\n</ul>\n<p>Collaboration &amp; Delivery</p>\n<ul>\n<li>Partner with Finance, Analysts, and cross-functional stakeholders to translate business needs into data solutions</li>\n<li>Support self-service analytics by building intuitive, reusable datasets</li>\n<li>Contribute to scalable data workflows that balance immediate business needs with long-term maintainability</li>\n<li>Work within an agile environment, contributing to planning, prioritization, and continuous improvement</li>\n</ul>\n<p>AI and Data Mindset</p>\n<ul>\n<li>Demonstrate an AI-first mindset, thinking beyond data models and dashboards to how data can power intelligent systems and decision-making</li>\n<li>Understand the importance of well-modeled, well-documented, and semantically clear data for AI and LLM-based use cases</li>\n<li>A level of comfort leveraging AI-assisted workflows to improve productivity, code quality, and consistency</li>\n<li>Curiosity for emerging capabilities in platforms like Snowflake Cortex and Snowflake Intelligence, and how they can be applied to Enterprise analytics</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5–8+ years of experience in Analytics Engineering, Data Engineering, or similar roles</li>\n<li>Strong SQL skills and experience building analytics-ready data models</li>\n<li>Mentorship &amp; Engineering Excellence: Mentorship, raising the technical bar, establishing organization-wide standards for dbt/SQL quality and CI/CD</li>\n<li>Hands-on experience with dbt and Snowflake or other ETL, Modeling and database platforms</li>\n<li>Solid understanding of data modeling principles, including dimensional modeling and semantic design</li>\n<li>Ability to navigate highly ambiguous business challenges, translating vague, complex, or competing goals from executive stakeholders into clear, actionable, and robust data solutions</li>\n<li>Experience translating business requirements into clear, maintainable data logic</li>\n<li>Familiarity with SaaS metrics and Finance and People data (e.g., ARR, revenue recognition, billing, attrition etc.)</li>\n<li>Experience with data quality, testing, and documentation best practices</li>\n<li>Exposure to Python, R, or data processing frameworks (e.g., PySpark) is a plus</li>\n<li>Experience with BI tools such as Tableau or Looker</li>\n<li>Strong communication skills and ability to work across technical and business teams</li>\n</ul>\n<p>What you can look forward to as an Okta employee!</p>\n<ul>\n<li>Amazing Benefits</li>\n<li>Making Social Impact</li>\n<li>Fostering Diversity, Equity, Inclusion and Belonging at Okta</li>\n<li>Okta cultivates a dynamic work environment, providing the best tools, technology and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. 
Each organization is unique in the degree of flexibility and mobility in which they work so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0a3dc5a7-8d9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7818510","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["dbt","Snowflake","SQL","data modeling","dimensional modeling","semantic design","ETL","data quality","testing","documentation","Python","R","PySpark","Tableau","Looker"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:30.556Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bellevue, Washington; Chicago, Illinois; San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"dbt, Snowflake, SQL, data modeling, dimensional modeling, semantic design, ETL, data quality, testing, documentation, Python, R, PySpark, Tableau, Looker"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_25f010f0-7d1"},"title":"Data Engineer","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Brex’s AI-native automation and world-class service eliminate manual expense and accounting tasks for customers so they can focus on what matters most. Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream. We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Data at Brex</p>\n<p>Our Scientists and Engineers work together to make data , and insights derived from data , a core asset across Brex. But it&#39;s more than just crunching numbers. The Data team at Brex develops infrastructure, statistical models, and products using data. Our work is ingrained in Brex&#39;s decision-making process, the efficiency of our operations, our risk management policies, and the unparalleled experience we provide our customers.</p>\n<p>What You’ll Do</p>\n<p>As a Data Engineer at Brex, you will be a core contributor in transforming raw data into actionable insights for various departments across the organization. You&#39;ll collaborate closely with Data Scientists, Software Engineers, and business units to create efficient data models, pipelines, and analytics frameworks that drive the business forward. 
You also play a leading role in the design, implementation, and maintenance of Core Data tables, our high-quality, curated data source for a wide range of analytic applications.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our San Francisco office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday. As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, build, and maintain data models and pipelines that scale with the growing number of services, products, and changes in the company.</li>\n</ul>\n<ul>\n<li>Collaborate closely with Data Scientists, Data Analysts, and Business teams to understand their data needs, translating them into robust, efficient, scalable data solutions that enable ease of predictive analytics, data analysis, and metrics formulation.</li>\n</ul>\n<ul>\n<li>Maintain data documentation and definitions, building and ensuring that source-of-truth tables remain high quality for data science and reporting applications.</li>\n</ul>\n<ul>\n<li>Develop and enable integration with various data sources, allowing for more data-driven initiatives across the company.</li>\n</ul>\n<ul>\n<li>Apply best practices in data management to ensure the reliability and robustness of data utilized across various analytics applications.</li>\n</ul>\n<ul>\n<li>Set and proliferate company-wide standards for data relating to structure, quality, and expectations.</li>\n</ul>\n<ul>\n<li>Act as a liaison between the technical and non-technical teams, bridging gaps and ensuring that data solutions align with business objectives.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>3+ years of experience in Data Engineering, Data Analytics, or a related field such as Analytics Engineering.</li>\n</ul>\n<ul>\n<li>2+ years of experience working with modern data transformation tools like DBT.</li>\n</ul>\n<ul>\n<li>Advanced knowledge of databases and SQL with the ability to efficiently stage, process, and transform data.</li>\n</ul>\n<ul>\n<li>Experience integrating and orchestrating data workflows with various modern data tools and systems.</li>\n</ul>\n<ul>\n<li>Experience with data modeling, ETL/ELT processes, and data warehousing solutions.</li>\n</ul>\n<ul>\n<li>Experience working with a data warehouse such as Snowflake.</li>\n</ul>\n<ul>\n<li>Experience with a data workflow orchestrator tool such as Airflow.</li>\n</ul>\n<ul>\n<li>Experience with a programming language such as Python.</li>\n</ul>\n<ul>\n<li>Familiarity with BI tools such as Looker, Tableau, or similar platforms is a plus.</li>\n</ul>\n<ul>\n<li>Exceptional quantitative and analytical skills.</li>\n</ul>\n<ul>\n<li>Strong communication skills and ability to collaborate with various stakeholders, both technical and non-technical.</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $120,800 - $151,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_25f010f0-7d1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8366850002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$120,800 - $151,000","x-skills-required":["DBT","databases","SQL","data modeling","ETL/ELT processes","data warehousing solutions","Snowflake","Airflow","Python","BI tools","Looker","Tableau"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:18.514Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"DBT, databases, SQL, data modeling, ETL/ELT processes, data warehousing solutions, Snowflake, Airflow, Python, BI tools, Looker, Tableau","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120800,"maxValue":151000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1d204fa1-067"},"title":"Data Engineer","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Data at Brex</p>\n<p>Our Scientists and Engineers work together to make data , and insights derived from data , a core asset across Brex. But it&#39;s more than just crunching numbers. The Data team at Brex develops infrastructure, statistical models, and products using data. Our work is ingrained in Brex&#39;s decision-making process, the efficiency of our operations, our risk management policies, and the unparalleled experience we provide our customers.</p>\n<p>What You’ll Do</p>\n<p>As a Data Engineer at Brex, you will be a core contributor in transforming raw data into actionable insights for various departments across the organization. You&#39;ll collaborate closely with Data Scientists, Software Engineers, and business units to create efficient data models, pipelines, and analytics frameworks that drive the business forward. You also play a leading role in the design, implementation, and maintenance of Core Data tables, our high-quality, curated data source for a wide range of analytic applications.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our Seattle office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday. 
As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, build, and maintain data models and pipelines that scale with the growing number of services, products, and changes in the company.</li>\n</ul>\n<ul>\n<li>Collaborate closely with Data Scientists, Data Analysts, and Business teams to understand their data needs, translating them into robust, efficient, scalable data solutions that enable ease of predictive analytics, data analysis, and metrics formulation.</li>\n</ul>\n<ul>\n<li>Maintain data documentation and definitions, building and ensuring that source-of-truth tables remain high quality for data science and reporting applications.</li>\n</ul>\n<ul>\n<li>Develop and enable integration with various data sources, allowing for more data-driven initiatives across the company.</li>\n</ul>\n<ul>\n<li>Apply best practices in data management to ensure the reliability and robustness of data utilized across various analytics applications.</li>\n</ul>\n<ul>\n<li>Set and proliferate company-wide standards for data relating to structure, quality, and expectations.</li>\n</ul>\n<ul>\n<li>Act as a liaison between the technical and non-technical teams, bridging gaps and ensuring that data solutions align with business objectives.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>3+ years of experience in Data Engineering, Data Analytics, or a related field such as Analytics Engineering.</li>\n</ul>\n<ul>\n<li>2+ years of experience working with modern data transformation tools like DBT.</li>\n</ul>\n<ul>\n<li>Advanced knowledge of databases and SQL with the ability to efficiently stage, process, and transform data.</li>\n</ul>\n<ul>\n<li>Experience integrating and orchestrating data workflows with various modern data tools and systems.</li>\n</ul>\n<ul>\n<li>Experience with data modeling, ETL/ELT processes, and data warehousing solutions.</li>\n</ul>\n<ul>\n<li>Experience working with a data warehouse such as Snowflake.</li>\n</ul>\n<ul>\n<li>Experience with a data workflow orchestrator tool such as Airflow.</li>\n</ul>\n<ul>\n<li>Experience with a programming language such as Python.</li>\n</ul>\n<ul>\n<li>Familiarity with BI tools such as Looker, Tableau, or similar platforms is a plus.</li>\n</ul>\n<ul>\n<li>Exceptional quantitative and analytical skills.</li>\n</ul>\n<ul>\n<li>Strong communication skills and ability to collaborate with various stakeholders, both technical and non-technical.</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $120,800 - $151,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_1d204fa1-067","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8510493002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$120,800 - $151,000","x-skills-required":["DBT","databases","SQL","data modeling","ETL/ELT processes","data warehousing solutions","Snowflake","Airflow","Python","BI tools","Looker","Tableau"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:02.393Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Seattle, Washington, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"DBT, databases, SQL, data modeling, ETL/ELT processes, data warehousing solutions, Snowflake, Airflow, Python, BI tools, Looker, Tableau","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120800,"maxValue":151000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e00b7052-70b"},"title":"Senior Business Systems Analyst, Finance Systems","description":"<p>We are seeking an experienced Senior Business Systems Analyst to join our Finance Systems team at Anthropic. In this role, you will serve as the internal functional lead for our Workday Financials implementation, owning the design and configuration of the Financial Data Model (FDM), Chart of Accounts, and dimensional structures that will serve as the source of truth for financial reporting.</p>\n<p>You will develop Prism Analytics and Accounting Center solutions, gather requirements and build reporting capabilities, and collaborate closely with cross-functional teams to drive the successful adoption of our new ERP platform.</p>\n<p>This is a critical role that will directly shape how Anthropic&#39;s finance organisation operates as we scale toward public company readiness. 
You will work at the intersection of finance domain expertise and technical implementation, partnering with the implementation partner, engineering teams, and finance stakeholders to build a world-class financial systems foundation.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>ERP Core Financials Implementation: Serve as internal functional lead for Workday Financials implementation, partnering with consultants to drive configuration decisions, validate designs, and ensure business requirements are met</li>\n</ul>\n<ul>\n<li>Financial Data Model (FDM) Design: Own the design and configuration of Chart of Accounts, Worktags, dimensional hierarchies, and Accounting Books that will serve as the source of truth for all financial reporting, ensuring support for both GAAP and Management reporting requirements</li>\n</ul>\n<ul>\n<li>Prism Analytics Development: Develop and maintain Prism/Accounting Center solutions from source analysis and ingestion design through build, testing, cutover, and hypercare, including integration with external data sources like BigQuery and Pigment</li>\n</ul>\n<ul>\n<li>Requirements Gathering &amp; Reporting: Gather business requirements from Finance, Accounting, and FP&amp;A stakeholders, translating them into hands-on development of executive reporting, dashboards, and analytics solutions</li>\n</ul>\n<ul>\n<li>Workshop Participation &amp; Solution Design: Participate in implementation workshops, challenge requirements, and translate business needs into buildable designs and testable acceptance criteria; manage defects and data quality issues throughout the project lifecycle</li>\n</ul>\n<ul>\n<li>Cross-Functional Collaboration: Collaborate with Integrations, Security, and Financials configuration teams to align master data, journals, controls, and performance service level agreements; partner with Data Infrastructure and BizTech teams on system integrations</li>\n</ul>\n<ul>\n<li>Cutover &amp; Hypercare Planning: Prepare cutover plans, data migration strategies, reconciliation frameworks, and hypercare plans; document data lineage, controls, and audit artifacts to support SOX compliance requirements</li>\n</ul>\n<ul>\n<li>Platform Expansion &amp; Adoption: Work closely with engineering teams and business stakeholders to drive ongoing expansion and adoption of the Workday platform, identifying opportunities for process improvement and automation</li>\n</ul>\n<p>You may be a good fit if you:</p>\n<ul>\n<li>Have 8+ years of experience in finance systems, ERP implementation, or business systems analysis roles, with at least 5 years of hands-on Workday Financials experience</li>\n</ul>\n<ul>\n<li>Possess deep expertise in Workday Financial Data Model (FDM), including Chart of Accounts design, Worktags configuration, dimensional hierarchies, and Accounting Books setup</li>\n</ul>\n<ul>\n<li>Have strong experience with Workday Prism Analytics, including data modeling, source integration, calculated fields, and report development</li>\n</ul>\n<ul>\n<li>Are skilled at translating complex business requirements into technical solutions, bridging the gap between finance stakeholders and technical implementation teams</li>\n</ul>\n<ul>\n<li>Have experience with full ERP implementation lifecycles, including requirements gathering, configuration, testing, data migration, cutover planning, and hypercare</li>\n</ul>\n<ul>\n<li>Possess strong understanding of financial accounting processes including General Ledger, multi-entity consolidation, intercompany accounting, and management 
reporting</li>\n</ul>\n<ul>\n<li>Have excellent stakeholder management and communication skills, with ability to work effectively with finance leadership, accounting teams, and technical partners</li>\n</ul>\n<ul>\n<li>Demonstrate strong analytical and problem-solving skills with attention to detail and commitment to data accuracy and integrity</li>\n</ul>\n<ul>\n<li>Are comfortable working in fast-paced, high-growth environments with evolving requirements and tight timelines</li>\n</ul>\n<p>Strong candidates may also have:</p>\n<ul>\n<li>Background in accounting, finance, or CPA certification with understanding of GAAP/IFRS reporting requirements</li>\n</ul>\n<ul>\n<li>Experience with Workday Accounting Center for complex journal automation and subledger accounting</li>\n</ul>\n<ul>\n<li>Technical proficiency with SQL, Python, or scripting languages for data analysis and integration support</li>\n</ul>\n<ul>\n<li>Experience integrating Workday with external data platforms such as BigQuery or cloud data warehouses</li>\n</ul>\n<ul>\n<li>Knowledge of SOX compliance requirements and internal controls for financial systems</li>\n</ul>\n<ul>\n<li>Experience with EPM/FP&amp;A systems such as Pigment, Anaplan, or Adaptive Planning and their integration with ERP</li>\n</ul>\n<ul>\n<li>Prior experience at high-growth technology companies scaling toward IPO readiness</li>\n</ul>\n<ul>\n<li>Familiarity with Workday HCM and understanding of HCM-Financials integration points</li>\n</ul>\n<ul>\n<li>Experience with data migration tools, ETL processes, and reconciliation frameworks for ERP implementations</li>\n</ul>\n<p>The annual compensation range for this role is $205,000-$265,000 USD.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e00b7052-70b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.co/","logo":"https://logos.yubhub.co/anthropic.co.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/4991194008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$205,000-$265,000 USD","x-skills-required":["Workday Financials","Workday Financial Data Model (FDM)","Chart of Accounts design","Worktags configuration","Dimensional hierarchies","Accounting Books setup","Prism Analytics","Data modeling","Source integration","Calculated fields","Report development","ERP implementation lifecycles","Requirements gathering","Configuration","Testing","Data migration","Cutover planning","Hypercare","Financial accounting processes","General Ledger","Multi-entity consolidation","Intercompany accounting","Management reporting","Stakeholder management","Communication skills","Analytical skills","Problem-solving skills","Data accuracy and integrity"],"x-skills-preferred":["SQL","Python","Scripting languages","BigQuery","Cloud data warehouses","SOX compliance requirements","Internal controls","EPM/FP&A systems","Pigment","Anaplan","Adaptive Planning","ERP integration","High-growth technology companies","IPO readiness","Workday HCM","HCM-Financials integration points","Data migration tools","ETL processes","Reconciliation frameworks"],"datePosted":"2026-04-18T15:45:44.214Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA | Seattle, 
WA"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"Workday Financials, Workday Financial Data Model (FDM), Chart of Accounts design, Worktags configuration, Dimensional hierarchies, Accounting Books setup, Prism Analytics, Data modeling, Source integration, Calculated fields, Report development, ERP implementation lifecycles, Requirements gathering, Configuration, Testing, Data migration, Cutover planning, Hypercare, Financial accounting processes, General Ledger, Multi-entity consolidation, Intercompany accounting, Management reporting, Stakeholder management, Communication skills, Analytical skills, Problem-solving skills, Data accuracy and integrity, SQL, Python, Scripting languages, BigQuery, Cloud data warehouses, SOX compliance requirements, Internal controls, EPM/FP&A systems, Pigment, Anaplan, Adaptive Planning, ERP integration, High-growth technology companies, IPO readiness, Workday HCM, HCM-Financials integration points, Data migration tools, ETL processes, Reconciliation frameworks","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":205000,"maxValue":265000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_533ef325-495"},"title":"Director Product Management, Banking + AP","description":"<p>Join Brex, the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. As Director of Product Management, Banking + AP, you will define and drive the strategy for one of Brex&#39;s most critical domains: the Operating Account platform - the combination of Banking and Accounts Payable that makes Brex the place customers run payments from day one.</p>\n<p>Brex&#39;s banking product is used by tens of thousands of businesses to store $10B+ in total deposits, and our AP platform moves over $1B across hundreds of thousands of transactions each month. This role will own the end-to-end product vision and execution across customer accounts, money movement, treasury capabilities, liquidity management, accounts payable workflows, and regulatory-compliant banking services.</p>\n<p>As a member of the Product team, you will be at the forefront of Brex&#39;s mission to empower employees anywhere to make better financial decisions. With a deep understanding of the business, you will identify and scope out the most impactful opportunities for Brex to tackle. 
You will be responsible for aligning cross-functional teams, such as Engineering, Legal, Compliance, and Design, on key decisions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Define and drive Operating Account strategy - Own the multi-year product vision and roadmap for Brex&#39;s Operating Account platform spanning Banking and Accounts Payable, aligned to company-level priorities.</li>\n<li>Identify and prioritize high-leverage opportunities that drive AP attach, deepen operating behavior, and grow durable DDA balances through payment activity rather than balance-seeking alone.</li>\n<li>Identify customer challenges across increasing coordination and control complexity, from admin-initiated payments and centralized AP through decentralized, multi-stakeholder procurement workflows.</li>\n<li>Balance innovation with regulatory rigor, ensuring long-term scalability and resilience.</li>\n</ul>\n<ul>\n<li>Lead complex, regulated systems across Banking and AP - Oversee core banking and AP domains including accounts, ledger systems, payment rails (ACH, wires, RTP, international), bill pay, vendor management, treasury workflows, and embedded financial services.</li>\n<li>Drive the integration of banking and AP capabilities into a unified operating experience, ensuring that money movement, vendor payments, and account management work seamlessly together.</li>\n<li>Engage deeply with Engineering and Risk in discussions around APIs, ledger design, reconciliation, data models, and risk controls, to design reliable and scalable systems.</li>\n</ul>\n<ul>\n<li>Build and lead a high-performing PM team - Manage and develop a team of Product Managers spanning banking and AP, fostering ownership, clarity, and high standards.</li>\n<li>Elevate product craft across the organization, being at the forefront of how the role is changing as we incorporate agents.</li>\n<li>Establish strong operating rhythms for planning, prioritization, and cross-functional alignment.</li>\n</ul>\n<ul>\n<li>Deliver measurable business impact - Define success metrics tied to company outcomes such as operating account attach rate, DDA balance growth, AP transaction volume, revenue expansion, margin improvement, customer retention, risk loss rates, and operational efficiency.</li>\n<li>Drive structured decision-making using data, experimentation, and financial modeling, with a sharp focus on the relationship between AP behavior and balance durability.</li>\n<li>Hold teams accountable to clear impact targets and measurable results.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>10+ years of product management experience, including leading and developing high-performing PM teams and building strong product cultures.</li>\n<li>Proven track record of setting and executing multi-quarter or multi-year strategies that delivered measurable business impact (growth, retention, revenue expansion, cost efficiency, or adoption).</li>\n<li>Strong business and systems thinking, able to translate complex operational or infrastructure challenges into simple customer experiences that drive adoption and long-term platform stickiness.</li>\n<li>Data-driven decision maker with experience defining success metrics, modeling business impact, and using SQL or similar tools to inform priorities and tradeoffs.</li>\n<li>Exceptional communicator who creates clarity in complex environments, distilling strategy, tradeoffs, and decisions into narratives that align executives and mobilize teams.</li>\n<li>Experience building trusted, mission-critical products where 
reliability, accuracy, and customer confidence are non-negotiable.</li>\n<li>High standards for product rigor and execution quality , consistently raising the bar for clarity of thinking, prioritization, and customer impact.</li>\n</ul>\n<p>Bonus Points:</p>\n<ul>\n<li>Experience building or scaling accounts payable, payments, or money movement workflows, especially in B2B environments.</li>\n<li>Experience with financial infrastructure, fintech platforms, or regulated products.</li>\n<li>Familiarity with distributed systems, APIs, ledgers, payment rails, or financial data models.</li>\n<li>Experience improving or automating procurement, bill pay, vendor management, or operational finance workflows.</li>\n<li>Track record of driving deposit growth, balance-based revenue, or usage-driven platform adoption.</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $340,000 - $425,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_533ef325-495","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8439927002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$340,000 - $425,000","x-skills-required":["product management","banking","accounts payable","money movement","treasury capabilities","liquidity management","financial data models","APIs","ledger design","reconciliation","risk controls","SQL","data modeling","financial modeling","data analysis","communication","leadership","team management","product development","product launch","product growth","customer acquisition","customer retention","customer satisfaction","operational efficiency","cost reduction","revenue growth","margin improvement"],"x-skills-preferred":["fintech","regulated products","distributed systems","payment rails","vendor management","procurement","bill pay","operational finance","deposit growth","balance-based revenue","usage-driven platform adoption"],"datePosted":"2026-04-18T15:45:19.941Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Seattle, Washington, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"product management, banking, accounts payable, money movement, treasury capabilities, liquidity management, financial data models, APIs, ledger design, reconciliation, risk controls, SQL, data modeling, financial modeling, data analysis, communication, leadership, team management, product development, product launch, product growth, customer acquisition, customer retention, customer satisfaction, operational efficiency, cost reduction, revenue growth, margin improvement, fintech, regulated products, distributed systems, payment rails, vendor management, procurement, bill pay, operational finance, deposit growth, balance-based revenue, usage-driven platform 
adoption","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":340000,"maxValue":425000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7a6a5e65-740"},"title":"Data Analyst III","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream. We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Data at Brex</p>\n<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations. Our Data Scientists, Analysts, and Engineers work together to make data, and insights derived from data, a core asset across the company.</p>\n<p>What you’ll do</p>\n<p>As a senior Data Analyst (DA III), you will own the end-to-end analytics lifecycle for one or more business areas at Brex. You’ll go beyond building dashboards: you’ll frame the right questions, design rigorous analyses, apply statistical methods, and translate your findings into clear recommendations for leadership. You will also serve as a technical leader on the Data Analytics team, mentoring more junior analysts and helping define the standards and best practices that elevate the team’s work.</p>\n<p>This role sits at the intersection of analytics, analytics engineering, and business strategy. You’ll work in a modern data stack environment and partner closely with Data Scientists, Data Engineers, and senior leaders across the organization.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our New York office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of three coordinated days in the office per week, Monday, Wednesday and Thursday. 
As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities</p>\n<ul>\n<li>Own the analytics lifecycle for assigned business areas: from problem framing and data sourcing through analysis, insight generation, and stakeholder presentation.</li>\n</ul>\n<ul>\n<li>Build and maintain dashboards and self-service reporting tools that enable business teams to independently track performance, identify risks, and make data-driven decisions.</li>\n</ul>\n<ul>\n<li>Write production-quality SQL and Python code to extract, transform, and analyze data at scale.</li>\n</ul>\n<ul>\n<li>Collaborate with Data Engineers and Data Scientists to develop and maintain analytical data models, improve data pipelines, and ensure data quality across the organization.</li>\n</ul>\n<ul>\n<li>Partner with leadership across Sales, Operations, Product, Finance, and other departments to identify high-impact analytical opportunities and deliver actionable recommendations.</li>\n</ul>\n<ul>\n<li>Mentor other data analysts and contribute to the development of team standards, documentation, code review practices, and analytical frameworks.</li>\n</ul>\n<ul>\n<li>Proactively identify gaps in data infrastructure, propose improvements, and contribute to the evolution of the team’s tooling and processes.</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>5+ years of experience in data analytics, business intelligence, or a related quantitative role.</li>\n</ul>\n<ul>\n<li>3+ years of experience partnering directly with Sales, Operations, Product, or equivalent business teams as an embedded analytics partner.</li>\n</ul>\n<ul>\n<li>Advanced SQL proficiency, including CTEs, window functions, performance optimization, and working across complex data models.</li>\n</ul>\n<ul>\n<li>Proficiency in Python for data analysis, automation, and modeling (Pandas, NumPy, scikit-learn, or similar).</li>\n</ul>\n<ul>\n<li>Experience with cloud data warehouses, particularly Snowflake (BigQuery and Databricks also valued).</li>\n</ul>\n<ul>\n<li>Hands-on experience with BI and data visualization tools (Looker, Tableau, Hex, or similar).</li>\n</ul>\n<ul>\n<li>Strong stakeholder management skills, proven ability to present complex technical findings to non-technical audiences.</li>\n</ul>\n<ul>\n<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automated reporting, and build self-service data tools.</li>\n</ul>\n<p>Bonus points</p>\n<ul>\n<li>Demonstrated experience applying statistical methods to business problems (e.g., regression, classification, A/B testing).</li>\n</ul>\n<ul>\n<li>Experience with dbt for data modeling and transformation.</li>\n</ul>\n<ul>\n<li>Experience building and maintaining data pipelines using orchestration tools such as Airflow.</li>\n</ul>\n<ul>\n<li>Experience working with APIs for data ingestion and integration.</li>\n</ul>\n<ul>\n<li>Familiarity with version control systems (Git).</li>\n</ul>\n<ul>\n<li>Experience in fintech, financial services, or payments.</li>\n</ul>\n<ul>\n<li>Track record of leading cross-functional analytics projects from scoping through delivery.</li>\n</ul>\n<p>Compensation</p>\n<p>The expected salary range for this role is $114,192 - $142,740. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7a6a5e65-740","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8463704002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$114,192 - $142,740","x-skills-required":["SQL","Python","Cloud data warehouses","BI and data visualization tools","Stakeholder management","Generative AI and LLM-based tools"],"x-skills-preferred":["Statistical methods","dbt for data modeling and transformation","Orchestration tools","APIs for data ingestion and integration","Version control systems","Fintech, financial services, or payments"],"datePosted":"2026-04-18T15:45:08.268Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New York, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"SQL, Python, Cloud data warehouses, BI and data visualization tools, Stakeholder management, Generative AI and LLM-based tools, Statistical methods, dbt for data modeling and transformation, Orchestration tools, APIs for data ingestion and integration, Version control systems, Fintech, financial services, or payments","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":114192,"maxValue":142740,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_73e4cd62-bb4"},"title":"Director Product Management, Banking + AP","description":"<p>Join Brex as Director of Product Management, Banking + AP, and define and drive the strategy for one of Brex&#39;s most critical domains: the Operating Account platform , the combination of Banking and Accounts Payable that makes Brex the place customers run payments from day one.</p>\n<p>As a key member of the Product team, you will oversee the end-to-end product vision and execution across customer accounts, money movement, treasury capabilities, liquidity management, accounts payable workflows, and regulatory-compliant banking services.</p>\n<p>Your work will shape Brex&#39;s competitive advantage in embedded banking and financial operations , making Brex indispensable to how businesses move and manage money, and enabling durable deposit growth as a natural outcome.</p>\n<p>Responsibilities: Define and drive Operating Account strategy Own the multi-year product vision and roadmap for Brex&#39;s Operating Account platform spanning Banking and Accounts Payable , aligned to company-level priorities. Identify and prioritize high-leverage opportunities that drive AP attach, deepen operating behavior, and grow durable DDA balances through payment activity rather than balance-seeking alone. Balance innovation with regulatory rigor, ensuring long-term scalability and resilience.</p>\n<p>Lead complex, regulated systems across Banking and AP Oversee core banking and AP domains including accounts, ledger systems, payment rails (ACH, wires, RTP, international), bill pay, vendor management, treasury workflows, and embedded financial services. 
Drive the integration of banking and AP capabilities into a unified operating experience, ensuring that money movement, vendor payments, and account management work seamlessly together. Engage deeply with Engineering and Risk in discussions around APIs, ledger design, reconciliation, data models, and risk controls, to design reliable and scalable systems.</p>\n<p>Build and lead a high-performing PM team Manage and develop a team of Product Managers spanning banking and AP, fostering ownership, clarity, and high standards. Elevate product craft across the organization, being at the forefront of how the role is changing as we incorporate agents Establish strong operating rhythms for planning, prioritization, and cross-functional alignment.</p>\n<p>Deliver measurable business impact Define success metrics tied to company outcomes such as operating account attach rate, DDA balance growth, AP transaction volume, revenue expansion, margin improvement, customer retention, risk loss rates, and operational efficiency. Drive structured decision-making using data, experimentation, and financial modeling , with a sharp focus on the relationship between AP behavior and balance durability. Hold teams accountable to clear impact targets and measurable results.</p>\n<p>Requirements 10+ years of product management experience, including leading and developing high-performing PM teams and building strong product cultures. Proven track record of setting and executing multi-quarter or multi-year strategies that delivered measurable business impact (growth, retention, revenue expansion, cost efficiency, or adoption). Strong business and systems thinking , able to translate complex operational or infrastructure challenges into simple customer experiences that drive adoption and long-term platform stickiness. Data-driven decision maker with experience defining success metrics, modeling business impact, and using SQL or similar tools to inform priorities and tradeoffs. Exceptional communicator who creates clarity in complex environments , distilling strategy, tradeoffs, and decisions into narratives that align executives and mobilize teams. Experience building trusted, mission-critical products where reliability, accuracy, and customer confidence are non-negotiable. High standards for product rigor and execution quality , consistently raising the bar for clarity of thinking, prioritization, and customer impact.</p>\n<p>Bonus Points Experience building or scaling accounts payable, payments, or money movement workflows, especially in B2B environments. Experience with financial infrastructure, fintech platforms, or regulated products. Familiarity with distributed systems, APIs, ledgers, payment rails, or financial data models. Experience improving or automating procurement, bill pay, vendor management, or operational finance workflows. Track record of driving deposit growth, balance-based revenue, or usage-driven platform adoption.</p>\n<p>Compensation The expected salary range for this role is $340,000 - $425,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_73e4cd62-bb4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8439924002","x-work-arrangement":"hybrid","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":"$340,000 - $425,000","x-skills-required":["Product Management","Banking","Accounts Payable","Financial Operations","Regulatory Compliance","APIs","Ledger Design","Reconciliation","Data Models","Risk Controls","SQL","Financial Modeling","Data-Driven Decision Making","Communication","Leadership","Team Management","Product Development","Financial Analysis","Business Strategy","Operational Efficiency"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:44:55.997Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New York, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"Product Management, Banking, Accounts Payable, Financial Operations, Regulatory Compliance, APIs, Ledger Design, Reconciliation, Data Models, Risk Controls, SQL, Financial Modeling, Data-Driven Decision Making, Communication, Leadership, Team Management, Product Development, Financial Analysis, Business Strategy, Operational Efficiency","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":340000,"maxValue":425000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_205a5f25-1f0"},"title":"Senior Manager, Infrastructure Data Science","description":"<p>Databricks is looking for a Senior Manager, Infrastructure Data Science to shape the future of Databricks infrastructure through data science. 
You will tackle some of the most complex challenges related to capacity planning, performance optimisation, reliability engineering, infrastructure efficiency, and customer experience.</p>\n<p>At Databricks, we enable data teams to solve the world&#39;s toughest problems by building and running the world&#39;s best data and AI infrastructure platform.</p>\n<p>As a Senior Manager, Infrastructure Data Science, you will lead a team of data scientists and work directly in partnership with engineering leaders to empower them with data-driven insights and solutions.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Thought leadership and strategic guidance on infrastructure planning, balancing current needs with future growth projections to ensure scalability and cost-effectiveness.</li>\n<li>Promoting a data-driven approach to infrastructure decisions, influencing stakeholders across engineering, and supporting the use of data science insights for high-impact, aligned strategies.</li>\n<li>Implementing data-driven solutions to identify, predict, and mitigate infrastructure risks and failures, reducing downtime and improving system reliability and performance, directly impacting end-user satisfaction and operational continuity.</li>\n<li>Spearheading analyses to improve resource utilisation efficiency, identifying and eliminating inefficiencies across infrastructure usage, resulting in cost savings and optimised performance.</li>\n<li>Establishing data frameworks that empower support teams to troubleshoot and resolve product issues faster, decreasing response times and enhancing customer experience and support quality.</li>\n<li>Mentoring and managing a team of data scientists, instilling best practices in data science, engineering, and fostering a collaborative environment focused on innovative, scalable infrastructure solutions.</li>\n</ul>\n<p>We look for candidates with 10+ years of infrastructure data science, machine learning, advanced analytics experience in high-velocity, high-growth companies, as well as 5+ years of management experience hiring and developing teams.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_205a5f25-1f0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/7734812002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$228,600-$314,250 USD","x-skills-required":["infrastructure data science","machine learning","advanced analytics","data visualisation","data engineering","data modelling","big data technologies","leadership","communication"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:44:53.915Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"infrastructure data science, machine learning, advanced analytics, data visualisation, data engineering, data modelling, big data technologies, leadership, 
communication","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":228600,"maxValue":314250,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2a2686d2-290"},"title":"Staff Analytics Engineer","description":"<p>At Twilio, we&#39;re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences.</p>\n<p>Our Data Science and Analytics team seeks to empower R&amp;D to make data-backed decisions that accelerate innovation and improve product performance. You will work closely within our team and across Product &amp; Engineering to design and maintain a robust analytics data layer that enables trusted reporting on R&amp;D metrics.</p>\n<p>In this role, you&#39;ll:</p>\n<ul>\n<li>Design and implement a formal analytics data layer using AWS Glue, Presto, and LookML</li>\n<li>Collaborate within the Data Science &amp; Analytics team and across Product &amp; Engineering to define, document, and maintain alignment on metric definition and data lineage</li>\n<li>Develop and maintain automated data reconciliation and quality checks to proactively identify and resolve discrepancies, ensuring accuracy and consistency of critical reports and dashboards</li>\n<li>Lead investigations into complex data anomalies, conduct root cause analysis, and communicate findings and solutions effectively to both technical and non-technical audiences</li>\n<li>Mentor and guide members of the data science and analytics team, establishing and enforcing best practices around data modeling, testing, documentation, and code review</li>\n</ul>\n<p>Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply.</p>\n<p>If your career is just starting or hasn&#39;t followed a traditional path, don&#39;t let that stop you from considering Twilio. 
We are always looking for people who will bring something new to the table!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2a2686d2-290","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Twilio","sameAs":"https://www.twilio.com/","logo":"https://logos.yubhub.co/twilio.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/twilio/jobs/7551660","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$155,520 - $194,400 (Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, Vermont or Washington D.C.)\n$164,640 - $205,800 (New York, New Jersey, Washington State, or California (outside of the San Francisco Bay area))\n$182,960 - $228,700 (San Francisco Bay area, California)","x-skills-required":["AWS Glue","Presto","LookML","SQL","data modeling","data pipelines","data reconciliation","data quality checks"],"x-skills-preferred":["Python","distributed computing technologies","Hive","Spark","dashboarding tools","Looker","Tableau"],"datePosted":"2026-04-18T15:43:20.940Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"AWS Glue, Presto, LookML, SQL, data modeling, data pipelines, data reconciliation, data quality checks, Python, distributed computing technologies, Hive, Spark, dashboarding tools, Looker, Tableau","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":155520,"maxValue":228700,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_49214f94-4ba"},"title":"Senior Manager, Infrastructure Data Science","description":"<p>We are looking for a Senior Manager, Infrastructure Data Science to shape the future of Databricks infrastructure through data science. You will tackle some of the most complex challenges related to capacity planning, performance optimisation, reliability engineering, infrastructure efficiency, and customer experience.</p>\n<p>As a Senior Manager, you will lead a team of data scientists and work directly in partnership with engineering leaders to empower them with data-driven insights and solutions. 
You will promote a data-driven approach to infrastructure decisions, influencing stakeholders across engineering, and supporting the use of data science insights for high-impact, aligned strategies.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Thought leadership and strategic guidance on infrastructure planning, balancing current needs with future growth projections to ensure scalability and cost-effectiveness.</li>\n<li>Implement data-driven solutions to identify, predict, and mitigate infrastructure risks and failures, reducing downtime and improving system reliability and performance, directly impacting end-user satisfaction and operational continuity.</li>\n<li>Spearhead analyses to improve resource utilisation efficiency, identifying and eliminating inefficiencies across infrastructure usage, resulting in cost savings and optimised performance.</li>\n<li>Establish data frameworks that empower support teams to troubleshoot and resolve product issues faster, decreasing response times and enhancing customer experience and support quality.</li>\n<li>Mentor and manage a team of data scientists, instilling best practices in data science, engineering, and fostering a collaborative environment focused on innovative, scalable infrastructure solutions.</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>10+ years of infrastructure data science, machine learning, advanced analytics experience in high-velocity, high-growth companies.</li>\n<li>5+ years of management experience hiring and developing teams.</li>\n<li>Experience developing data science, analytics, and machine learning and AI products and capabilities in a cloud environment.</li>\n<li>Knowledge of statistics and rigorous analytical techniques.</li>\n<li>Experience with data visualisation tools, knowledge of data engineering, data modelling, and big data technologies.</li>\n<li>Leadership skills and experience to lead across functional and organisational lines.</li>\n<li>Strong communication skills to explain and evangelise analytics and data science to executives and the senior management team.</li>\n<li>Bias to action and passion for delivering high-quality data solutions.</li>\n<li>A passion for problem-solving and comfort with ambiguity.</li>\n<li>MS or Ph.D. in quantitative fields (Statistics, Math, CS or Engineering).</li>\n</ul>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilising the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. 
For more information regarding which range your location is in visit our page here.</p>\n<p>Zone 1 Pay Range $228,600-$314,250 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_49214f94-4ba","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com/","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/7641390002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$228,600-$314,250 USD","x-skills-required":["infrastructure data science","machine learning","advanced analytics","cloud environment","statistics","data visualisation tools","data engineering","data modelling","big data technologies","leadership skills","communication skills","bias to action","passion for problem-solving","comfort with ambiguity","MS or Ph.D. in quantitative fields"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:43:09.520Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Mountain View, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"infrastructure data science, machine learning, advanced analytics, cloud environment, statistics, data visualisation tools, data engineering, data modelling, big data technologies, leadership skills, communication skills, bias to action, passion for problem-solving, comfort with ambiguity, MS or Ph.D. in quantitative fields","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":228600,"maxValue":314250,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9e36f934-a6f"},"title":"Elastic Consultant - Public Sector","description":"<p>We are seeking an Elastic Consultant to join our team in the Public Sector. As a consultant, you will work with customers to deliver and execute on professional services engagements. You will have the opportunity to work with a tremendous services, engineering, and sales team and wear many hats. This is a critical role, as consultants have an amazing chance to make an immediate impact on the success of Elastic and our customers.</p>\n<p>Strong customer advocacy, relationship building, and communications skills are essential for this role. You will need to be able to easily pivot from delivery to strategic engagements with customers. 
You will work with the wider Elastic organisation to support the customer&#39;s goals, their strategic requirements, and their journey with Elastic.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Strong customer advocacy, relationship building, and communications skills</li>\n<li>Ability to easily pivot from delivery to strategic engagements with customers</li>\n<li>Work with the wider Elastic organisation to support the customer&#39;s goals, their strategic requirements, and their journey with Elastic</li>\n<li>Ownership of the strategic roadmap with the customer including quarterly strategic sessions with senior and key stakeholders</li>\n<li>Solution design, development, and integration of Elastic products and APIs, platform architecture, and capacity planning in mission-critical environments</li>\n<li>Comfortable working remotely in a highly distributed team</li>\n<li>Development of demos and proof-of-concepts that highlight the value of the Elastic Stack</li>\n<li>Data modelling, query development and optimisation, cluster tuning and scaling with a focus on fast search and analytics at scale</li>\n<li>Solving our customers&#39; most challenging data problems</li>\n<li>Working closely with the Elastic engineering, product management, and support teams to identify feature enhancements and extensions</li>\n<li>Engaging with the Elastic Sales team to scope opportunities while assessing technical risks, questions, or concerns</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Hands-on experience and an understanding of Elasticsearch and/or Lucene</li>\n<li>Minimum of 2 years&#39; experience as a Software Engineer, System Administrator, or DevOps Engineer</li>\n<li>Minimum of 5 years&#39; experience working as a Consultant, working to deliver and execute on professional services engagements</li>\n<li>Currently holding UK security clearance, or has previously held security clearance, or is willing to undergo security clearance</li>\n<li>Experience as a technical instructor or public speaker to large audiences on enterprise infrastructure software technology to engineers, developers, and other technical positions</li>\n<li>Excel at working directly with customers to gather, prioritise, plan and execute solutions to customer business requirements as they relate to our technologies</li>\n<li>Understanding of and passion for open-source technology, with knowledge of and proficiency in at least one programming language</li>\n<li>Hands-on experience with large distributed systems from an architecture and development perspective</li>\n<li>Knowledge of information retrieval and/or analytics domain</li>\n<li>The nature of the work that you will be doing will require a high percentage of work onsite with customers, and you should expect to travel as a result of this requirement</li>\n<li>Understanding of Linux, Java, and databases</li>\n</ul>\n<p>Bonus Points:</p>\n<ul>\n<li>Deep understanding of Elasticsearch and Lucene, including Elastic Certified Engineer certification</li>\n<li>BS, MS, or PhD in Computer Science or related engineering discipline</li>\n<li>Strong knowledge of Java and Linux/Unix environment, software development, and/or experience with distributed systems</li>\n<li>Experience and interest in delivering and/or developing product training</li>\n<li>Experience contributing to an open-source project or documentation</li>\n</ul>\n<p>As a distributed company, diversity drives our identity. 
Whether you&#39;re looking to launch a new career or grow an existing one, Elastic is the type of company where you can balance great work with great life. Your age is only a number. It doesn&#39;t matter if you&#39;re just out of college or your children are; we need you for what you can do. We strive to have parity of benefits across regions and while regulations differ from place to place, we believe taking care of our people is the right thing to do.</p>\n<p>Competitive pay based on the work you do here and not your previous salary Health coverage for you and your family in many locations Ability to craft your calendar with flexible locations and schedules for many roles Generous number of vacation days each year Increase your impact - We match up to $2000 (or local currency equivalent) for financial donations and service Up to 40 hours each year to use toward volunteer projects you love Embracing parenthood with minimum of 16 weeks of parental leave</p>\n<p>Elastic is an equal opportunity employer and is committed to creating an inclusive culture that celebrates different perspectives, experiences, and backgrounds. Qualified applicants will receive consideration for employment without regard to race, ethnicity, color, religion, sex, pregnancy, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, disability status, or any other basis protected by federal, state or local law, ordinance or regulation.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9e36f934-a6f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Elastic","sameAs":"https://www.elastic.co/","logo":"https://logos.yubhub.co/elastic.co.png"},"x-apply-url":"https://job-boards.greenhouse.io/elastic/jobs/7451737","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Elasticsearch","Lucene","Java","Linux","Databases","Data modelling","Query development and optimisation","Cluster tuning and scaling","Fast search and analytics at scale","Information retrieval and/or analytics domain"],"x-skills-preferred":["Elastic Certified Engineer certification","BS, MS, or PhD in Computer Science or related engineering discipline","Software development","Distributed systems","Product training","Open-source project or documentation"],"datePosted":"2026-04-18T15:43:03.779Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United Kingdom"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Elasticsearch, Lucene, Java, Linux, Databases, Data modelling, Query development and optimisation, Cluster tuning and scaling, Fast search and analytics at scale, Information retrieval and/or analytics domain, Elastic Certified Engineer certification, BS, MS, or PhD in Computer Science or related engineering discipline, Software development, Distributed systems, Product training, Open-source project or documentation"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_da823275-f35"},"title":"Finance Reporting & Analytics Manager","description":"<p>We are seeking a Finance Reporting &amp; Analytics Manager to join our Finance Transformation team and focus on delivering and advancing our reporting and 
analytics capabilities within the Finance organization.</p>\n<p>The primary mission of the Finance Transformation team is to drive change and efficiency in Finance to support strategic objectives.</p>\n<p>As the Finance Analytics Manager, you will be responsible for partnering with Finance stakeholders to scope and prioritize reporting and analytic needs, designing and delivering dashboards, reports, and analyses.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Collecting, analyzing, and understanding business reporting requirements and translating them to data models and dashboards</li>\n</ul>\n<ul>\n<li>Designing and building underlying data models to support end-user reporting</li>\n</ul>\n<ul>\n<li>Evaluating and prioritizing new data analytics and insights project submissions, ensuring alignment with Finance org priorities</li>\n</ul>\n<ul>\n<li>Enhancing and evolving Finance reporting capabilities and driving the adoption of analytics tools and aligned reporting frameworks</li>\n</ul>\n<ul>\n<li>Creating detailed metadata documentation for data models and columns to improve data clarity and usability for end-users</li>\n</ul>\n<ul>\n<li>Monitoring and maintaining data quality, validation/audits, and governance for in-scope data domains</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>5+ years of relevant work experience in Business Analytics, Data Modeling/Analytics, Data Management, or Program Management</li>\n</ul>\n<ul>\n<li>Prior experience working across different Finance stakeholders, such as FP&amp;A, Revenue, Financial Reporting, etc.</li>\n</ul>\n<ul>\n<li>Strong understanding of Finance reporting principles and best practices</li>\n</ul>\n<ul>\n<li>Experience conducting large-scale data analysis to support business decision-making</li>\n</ul>\n<ul>\n<li>Experience working with enterprise-wide data, structured and unstructured, joining large and complex data sets</li>\n</ul>\n<ul>\n<li>Experience with SQL, dbt, Sigma Computing, and version control systems such as Git</li>\n</ul>\n<ul>\n<li>Demonstrable ability to be agile, proactive, and comfortable working in ambiguity and think creatively to come up with innovative solutions to complex problems</li>\n</ul>\n<ul>\n<li>Ability to translate functional (business) needs to analytics/technology</li>\n</ul>\n<ul>\n<li>Familiarity with data structures from common Finance systems, such as NetSuite, Zuora Revpro</li>\n</ul>\n<ul>\n<li>Self-motivated and pragmatic, with a get-it-done mentality</li>\n</ul>\n<p>As a distributed company, diversity drives our identity. Whether you&#39;re looking to launch a new career or grow an existing one, Elastic is the type of company where you can balance great work with great life. Your age is only a number. 
It doesn&#39;t matter if you&#39;re just out of college or your children are; we need you for what you can do.</p>\n<p>We strive to have parity of benefits across regions and while regulations differ from place to place, we believe taking care of our people is the right thing to do.</p>\n<p>Benefits include competitive pay, health coverage for you and your family, flexible locations and schedules, generous vacation days, and more.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_da823275-f35","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Elastic, the Search AI Company","sameAs":"https://www.elastic.co/","logo":"https://logos.yubhub.co/elastic.co.png"},"x-apply-url":"https://job-boards.greenhouse.io/elastic/jobs/7657471","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","dbt","Sigma Computing","version control systems","data modeling","data analysis","finance reporting","enterprise-wide data"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:46.432Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Rica"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"SQL, dbt, Sigma Computing, version control systems, data modeling, data analysis, finance reporting, enterprise-wide data"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_21efddae-814"},"title":"Responsable des programmes et des opérations commerciales (contrat de 16 mois)","description":"<p>Job Title: Responsable des programmes et des opérations commerciales (contrat de 16 mois)</p>\n<p>Location: Canada</p>\n<p>Department: Business Operations/Analysis</p>\n<p>Job Description:</p>\n<p>Airbnb was born in 2007 when two hosts welcomed three guests to their San Francisco home, and has since grown to over 5 million hosts who have welcomed over 2 billion guest arrivals in almost every country across the globe.</p>\n<p>Every day, hosts offer unique stays and experiences that make it possible for guests to connect with communities in a more authentic way.</p>\n<p>The community that awaits you:</p>\n<p>Each day, Airbnb hosts offer unique stays and experiences that allow travelers to weave connections with communities in a more authentic way.</p>\n<p>Since Airbnb&#39;s inception, over 5 million hosts have welcomed over 1.5 billion visitors in nearly every country around the world.</p>\n<p>The organization&#39;s commercial leadership plays a central role in the company&#39;s growth and expansion efforts worldwide.</p>\n<p>Our colleagues work to commercialize new and existing Airbnb businesses, enabling Airbnb to become more than just a hosting platform.</p>\n<p>This includes recruiting and developing Airbnb&#39;s global offerings of stays, experiences, and services, as well as implementing Airbnb&#39;s global strategy in the market.</p>\n<p>By focusing on commercial strategy, quality procurement, and international expansion, the team opens the door to continuous growth and success for Airbnb in its next chapter.</p>\n<p>The regional operations team, part of the commercial operations division in Canada, ensures that Airbnb&#39;s activities are relevant at the local level and meet the needs of hosts and travelers worldwide by 
helping regions plan and implement growth strategies in areas such as supply, demand, and other sectors in a way that connects with the community.</p>\n<p>To support the development of our activities, we want to recruit the best and brightest talents to make strategic business decisions.</p>\n<p>To this end, the team is looking for a commercial operations manager to join our commercial operations team in Canada.</p>\n<p>Your Contribution:</p>\n<p>You will work closely with leaders and cross-functional teams and play a key role in developing our growth-focused programs and optimizing the community.</p>\n<p>The successful candidate will report to the National Director in Canada.</p>\n<p>Typical Day:</p>\n<p>Align all functions to achieve objectives set with the Canadian leadership.</p>\n<p>Organize regular commercial operations meetings and initiate constructive conversations, as well as organize weekly project meetings with cross-functional teams (field and online operations, marketing, products, legal services) to ensure that key initiatives are on track in Canada.</p>\n<p>Study macroeconomic and microeconomic trends and present analysis, reports, and action plans to the Canadian national and regional Airbnb management team to enable them to make strategic business decisions.</p>\n<p>Support the National Director in Canada and other senior leaders on key strategic projects, as well as lead and implement ad-hoc operational and strategic projects for Canada.</p>\n<p>May need to coordinate activities with national, regional, and local teams in San Francisco.</p>\n<p>Be responsible for designing, executing, and evaluating end-to-end cross-functional initiatives.</p>\n<p>In concrete terms, this involves defining a precise problem statement, articulating the ideal solution to that problem, and identifying the main dependencies that hinder the solution, then working with cross-functional stakeholders to implement it.</p>\n<p>These initiatives will cover areas such as demand, supply, and regulatory aspects of the business.</p>\n<p>Your Expertise:</p>\n<p>The ideal candidate has at least ten years of professional experience, preferably with a combination of exposure to data modeling, analysis, and project management in a dynamic entrepreneurial environment.</p>\n<p>Excellent communication skills, both written and verbal, combined with the ability to deliver concise, clear, and compelling presentations at all levels of administration.</p>\n<p>Proven experience in influencing and mobilizing various cross-functional stakeholders towards a common mission or objective using qualitative and quantitative information.</p>\n<p>Ability to dive into details to drive execution, as well as take a step back to contextualize recommendations and initiatives within the company&#39;s overall global strategy.</p>\n<p>Ability to perform multiple tasks by prioritizing work and coordinating necessary support between various functions to achieve project goals.</p>\n<p>Impeccable organizational skills, great attention to detail, and the ability to document processes and decisions made.</p>\n<p>Experience in implementing and developing initiatives within dispersed teams.</p>\n<p>Experience working cross-functionally with multiple departments.</p>\n<p>Practical approach to moving things forward quickly.</p>\n<p>Ability to manipulate large datasets in Excel 
is essential. SQL mastery is highly preferred.</p>\n<p>Your Location:</p>\n<p>This position is located in Canada (telecommuting possible).</p>\n<p>The position may involve working occasionally in an Airbnb office or participating in events outside the office (as agreed with your manager).</p>\n<p>Telecommuting is possible, but you must live in an area where there is a registered Airbnb Canada Inc. entity, which is British Columbia, Alberta, Ontario, and Quebec.</p>\n<p>Please contact us if you live in a province or territory that is not part of this list.</p>\n<p>If you already work for another Airbnb entity, your recruiter will inform you in which provinces and territories you can work.</p>\n<p>Our commitment to inclusion and belonging:</p>\n<p>Airbnb is committed to working with people from the most diverse backgrounds possible.</p>\n<p>We believe that diversity of ideas encourages innovation and engagement.</p>\n<p>This approach helps us attract creative talent and develop the best products and services, as well as the best possible solutions.</p>\n<p>We encourage anyone qualified to apply.</p>\n<p>We also strive to provide an inclusive hiring and interview process for people with disabilities.</p>\n<p>If you have a disability and need reasonable accommodations to submit your application, please contact us at reasonableaccommodations@airbnb.com.</p>\n<p>Please indicate your full name, the position you are applying for, and the accommodations you need to help you through the recruitment process.</p>\n<p>We ask that you only contact us about accommodations, not about the status of your application.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_21efddae-814","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Airbnb","sameAs":"https://www.airbnb.com/","logo":"https://logos.yubhub.co/airbnb.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/airbnb/jobs/7743374","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data modeling","analysis","project management","communication","influencing","mobilizing","qualitative information","quantitative information","Excel","SQL"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:00.694Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Canada"}},"employmentType":"FULL_TIME","occupationalCategory":"Operations","industry":"Technology","skills":"data modeling, analysis, project management, communication, influencing, mobilizing, qualitative information, quantitative information, Excel, SQL"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e7491b84-e4f"},"title":"Backend Engineer, Knowledge Graph (Rust)","description":"<p>As an Intermediate Backend Engineer on the GitLab Knowledge Graph team, you&#39;ll help build and operate a graph data service that supports GitLab Duo agents, analytics, and architecture-level features across GitLab.com, Dedicated, and Self-Managed deployments.</p>\n<p>You&#39;ll join a small, Rust-first team that values clear ownership, thoughtful system design, and rigorous thinking about data and reliability. The Knowledge Graph service is a Rust backend that builds a property graph from GitLab&#39;s software development lifecycle (SDLC) and code data. 
It uses ClickHouse, NATS JetStream, and the Data Insights Platform. It exposes secure graph queries and MCP tools used by AI agents and product features.</p>\n<p>In this role, you&#39;ll deliver features and improvements in well-scoped areas, learn the broader architecture, and contribute to reliability, observability, and operational readiness. In your first year, you&#39;ll take clear ownership of specific components or features (for example, parts of the SDLC indexing pipeline or query paths). You&#39;ll help reduce single points of failure with better tests and runbooks, and you&#39;ll help the team ship analytical services that are easier to maintain and evolve over time.</p>\n<p>Key responsibilities include:</p>\n<p>Implementing and iterating on backend features in the Rust-based Knowledge Graph service, including changes to the query engine, SDLC and code indexing flows, and API endpoints (including MCP endpoints) under guidance from senior and staff engineers.</p>\n<p>Helping maintain integrations between Knowledge Graph and the rest of the GitLab platform, working in areas that touch GitLab Rails, the Data Insights Platform (Siphon, NATS, ClickHouse), and GitLab Duo Agent Platform.</p>\n<p>Contributing to system design discussions by proposing options, raising questions, and documenting decisions, with a focus on reliability, scalability, and maintainability for analytical graph workloads.</p>\n<p>Improving the operational maturity of the service by adding or enhancing metrics, logging, runbooks, alerts, and small readiness tasks, and by participating in on-call rotation as appropriate for your level and experience.</p>\n<p>Collaborating asynchronously with product, data, infrastructure, security, and AI counterparts to clarify requirements, align on scope, and ship features safely for customers and sustainably for the team.</p>\n<p>Using AI-assisted development workflows responsibly (for example, using Knowledge Graph-backed agents and internal Duo tooling), and sharing what works with the team while keeping a strong focus on code quality and correctness.</p>\n<p>Participating in code reviews, knowledge-sharing sessions, and pairing to both learn from others and help maintain consistent standards across the codebase.</p>\n<p>Contribute across the stack when needed, including occasional Ruby work for Rails integration and authorization paths, or small frontend changes related to Knowledge Graph features (for example, Software Architecture Map UI plumbing).</p>\n<p>What you&#39;ll bring:</p>\n<p>Professional experience building and maintaining backend systems in production, with an understanding of reliability, maintainability, and how to support services over time (incident responses, and follow-ups, etc).</p>\n<p>Proficiency in at least one modern backend language and strong interest in Rust, with either prior Rust experience or clear evidence you can ramp quickly and deliver in a Rust-first, performance-sensitive codebase.</p>\n<p>Some exposure to distributed data or analytics systems (for example, OLAP databases, Kafka- or NATS-style messaging, or change data capture (CDC) pipelines), or strong motivation to develop those skills in this role.</p>\n<p>Interest in graph data modeling and query patterns (property graphs, multi-step (n-hop) traversals, aggregations), and willingness to learn the tools and concepts used in Knowledge Graph over time.</p>\n<p>Practical experience (or strong interest) using AI tools in day-to-day development, along with a thoughtful approach to 
validating outputs and integrating AI into your workflow.</p>\n<p>A language-agnostic mindset and evidence that you can pick up new languages and frameworks as needed (for example, Ruby, Go, or TypeScript/Vue where the work touches adjacent systems).</p>\n<p>Solid fundamentals in system design for your level, including the ability to reason about trade-offs, ask good questions, and align your implementation work with documented architectural decisions.</p>\n<p>Comfort working in a low-process, high-ownership environment where you take responsibility for your work, communicate progress clearly, and help refine problem statements with your teammates.</p>\n<p>Strong written communication and comfort collaborating asynchronously across time zones in an all-remote team.</p>\n<p>About the team:</p>\n<p>We sit within the Data Engineering organization. We&#39;re a small group of senior engineers and we work closely with partners across AI (Duo Agent Platform), analytics, infrastructure and delivery, and security because our work spans many parts of the platform. We collaborate asynchronously and optimize for strong ownership rather than a feature factory model. We each build a meaningful understanding of the system and help evolve it over time. A key challenge for us right now is scaling sustainably. That includes hardening multi-tenant behavior, maturing observability and readiness, and keeping the system healthy and maintainable as usage grows and team members take time off. At the same time, we&#39;re bringing Knowledge Graph to general availability (GA).</p>\n<p>How GitLab Supports Full-Time Employees:</p>\n<p>Benefits to support your health, finances, and well-being Flexible Paid Time Off Team Member Resource Groups Equity Compensation &amp; Employee Stock Purchase Plan Growth and Development Fund Parental leave Home office support</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e7491b84-e4f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"GitLab","sameAs":"https://about.gitlab.com/","logo":"https://logos.yubhub.co/about.gitlab.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/gitlab/jobs/8481958002","x-work-arrangement":"remote","x-experience-level":"intermediate","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Rust","backend systems","distributed data","analytics systems","graph data modeling","query patterns","AI tools","system design","low-process","high-ownership environment"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:41:48.392Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, India"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Rust, backend systems, distributed data, analytics systems, graph data modeling, query patterns, AI tools, system design, low-process, high-ownership environment"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0ccde3b0-9f8"},"title":"Director Product Management, Banking + AP","description":"<p>Join Brex, the intelligent finance platform that empowers companies to spend smarter and move faster in more than 200 markets.</p>\n<p>As Director of Product Management, Banking + AP, you will define and drive the strategy for one of Brex&#39;s most critical domains: the Operating 
Account platform, the combination of Banking and Accounts Payable that makes Brex the place customers run payments from day one.</p>\n<p>Brex&#39;s banking product is used by tens of thousands of businesses to store $10B+ in total deposits, and our AP platform moves over $1B across hundreds of thousands of transactions each month.</p>\n<p>This role will own the end-to-end product vision and execution across customer accounts, money movement, treasury capabilities, liquidity management, accounts payable workflows, and regulatory-compliant banking services.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Define and drive Operating Account strategy - Own the multi-year product vision and roadmap for Brex&#39;s Operating Account platform spanning Banking and Accounts Payable, aligned to company-level priorities.</li>\n<li>Identify and prioritize high-leverage opportunities that drive AP attach, deepen operating behavior, and grow durable DDA balances through payment activity rather than balance-seeking alone.</li>\n<li>Identify customer challenges across increasing coordination and control complexity, from admin-initiated payments and centralized AP through decentralized, multi-stakeholder procurement workflows.</li>\n<li>Balance innovation with regulatory rigor, ensuring long-term scalability and resilience.</li>\n</ul>\n<p>Lead complex, regulated systems across Banking and AP:</p>\n<ul>\n<li>Oversee core banking and AP domains including accounts, ledger systems, payment rails (ACH, wires, RTP, international), bill pay, vendor management, treasury workflows, and embedded financial services.</li>\n<li>Drive the integration of banking and AP capabilities into a unified operating experience, ensuring that money movement, vendor payments, and account management work seamlessly together.</li>\n<li>Engage deeply with Engineering and Risk in discussions around APIs, ledger design, reconciliation, data models, and risk controls to design reliable and scalable systems.</li>\n</ul>\n<p>Build and lead a high-performing PM team:</p>\n<ul>\n<li>Manage and develop a team of Product Managers spanning banking and AP, fostering ownership, clarity, and high standards.</li>\n<li>Elevate product craft across the organization, being at the forefront of how the role is changing as we incorporate agents.</li>\n<li>Establish strong operating rhythms for planning, prioritization, and cross-functional alignment.</li>\n</ul>\n<p>Deliver measurable business impact:</p>\n<ul>\n<li>Define success metrics tied to company outcomes such as operating account attach rate, DDA balance growth, AP transaction volume, revenue expansion, margin improvement, customer retention, risk loss rates, and operational efficiency.</li>\n<li>Drive structured decision-making using data, experimentation, and financial modeling, with a sharp focus on the relationship between AP behavior and balance durability.</li>\n<li>Hold teams accountable to clear impact targets and measurable results.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>10+ years of product management experience, including leading and developing high-performing PM teams and building strong product cultures.</li>\n<li>Proven track record of setting and executing multi-quarter or multi-year strategies that delivered measurable business impact (growth, retention, revenue expansion, cost efficiency, or adoption).</li>\n<li>Strong business and systems thinking, able to translate complex operational or infrastructure challenges into simple customer experiences that drive adoption and long-term 
platform stickiness.</li>\n<li>Data-driven decision maker with experience defining success metrics, modeling business impact, and using SQL or similar tools to inform priorities and tradeoffs.</li>\n<li>Exceptional communicator who creates clarity in complex environments , distilling strategy, tradeoffs, and decisions into narratives that align executives and mobilize teams.</li>\n<li>Experience building trusted, mission-critical products where reliability, accuracy, and customer confidence are non-negotiable.</li>\n<li>High standards for product rigor and execution quality , consistently raising the bar for clarity of thinking, prioritization, and customer impact.</li>\n</ul>\n<p>Bonus Points:</p>\n<ul>\n<li>Experience building or scaling accounts payable, payments, or money movement workflows, especially in B2B environments.</li>\n<li>Experience with financial infrastructure, fintech platforms, or regulated products.</li>\n<li>Familiarity with distributed systems, APIs, ledgers, payment rails, or financial data models.</li>\n<li>Experience improving or automating procurement, bill pay, vendor management, or operational finance workflows.</li>\n<li>Track record of driving deposit growth, balance-based revenue, or usage-driven platform adoption.</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $340,000 - $425,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0ccde3b0-9f8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8439929002","x-work-arrangement":"hybrid","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":"$340,000 - $425,000","x-skills-required":["Product Management","Banking","Accounts Payable","Financial Infrastructure","Fintech Platforms","Regulated Products","Distributed Systems","APIs","Ledgers","Payment Rails","Financial Data Models","SQL","Data Modeling","Risk Controls","Vendor Management","Treasury Workflows","Embedded Financial Services"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:41:18.653Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Vancouver, British Columbia, Canada"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"Product Management, Banking, Accounts Payable, Financial Infrastructure, Fintech Platforms, Regulated Products, Distributed Systems, APIs, Ledgers, Payment Rails, Financial Data Models, SQL, Data Modeling, Risk Controls, Vendor Management, Treasury Workflows, Embedded Financial Services","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":340000,"maxValue":425000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0efa9467-f65"},"title":"Senior Solutions Architect","description":"<p>We&#39;re seeking a Senior Solutions Architect to join our Post Sales team and partner with enterprise customers to 
design and implement complex, AI-enabled Airtable solutions.</p>\n<p>In this high-impact role, you&#39;ll lead the architecture and delivery of enterprise implementations, translating business workflows into scalable, AI-powered systems that accelerate time-to-value and drive long-term adoption. You&#39;ll work closely with Engagement Managers, Account teams, and Product to shape the future of how organisations leverage Airtable AI to transform their operations.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Architect and deliver paid SOW enterprise Airtable implementations for strategic customers</li>\n<li>Lead scoping and design of Airtable AI solutions during professional services engagements</li>\n<li>Develop repeatable AI solution patterns, including AI workflows, automations, and structured data pipelines</li>\n<li>Establish best practices for data modelling, governance, and AI-enabled workflow design</li>\n<li>Partner with Engagement Managers to scope services engagements and define implementation plans</li>\n<li>Produce solution architecture diagrams, data models, and workflow documentation for enterprise customers</li>\n<li>Drive adoption of Airtable AI capabilities across multiple enterprise implementations</li>\n<li>Provide technical guidance and troubleshoot architectural challenges during implementations</li>\n<li>Collaborate with Product and internal teams to relay AI feature feedback and influence the roadmap</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>5+ years of solution architecture, solution engineering, or technical consulting experience for enterprise SaaS platforms</li>\n<li>Strong understanding of data modelling, database design, and governance</li>\n<li>Experience designing workflow automation and operational systems</li>\n<li>Familiarity with AI-enabled SaaS capabilities (LLMs, AI automation, AI-assisted workflows, or AI copilots)</li>\n<li>Ability to translate complex business processes into scalable technical architectures</li>\n<li>Strong stakeholder communication skills across technical and executive audiences</li>\n<li>Experience scoping and delivering enterprise implementations</li>\n<li>Proficiency in process mapping and architecture documentation tools (e.g., Lucidchart, Visio)</li>\n<li>Experience designing AI-driven workflow solutions (LLMs, prompt design, AI agents, AI automation)</li>\n<li>Experience implementing platforms like Airtable, Notion, ServiceNow, Salesforce, Workato, or similar workflow/data systems</li>\n<li>Familiarity with APIs, integrations, and automation frameworks</li>\n<li>Experience with enterprise data governance and security models</li>\n<li>Experience in Professional Services or consulting organisations</li>\n<li>Technical familiarity with scripting or low-code/no-code platforms</li>\n<li>Experience developing repeatable architecture patterns or implementation frameworks</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0efa9467-f65","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Airtable","sameAs":"https://airtable.com/","logo":"https://logos.yubhub.co/airtable.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/airtable/jobs/8487502002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Solution Architecture","Data Modelling","Database Design","Governance","AI-Enabled 
SaaS Capabilities","LLMs","AI Automation","AI-Assisted Workflows","AI Copilots","Process Mapping","Architecture Documentation Tools","APIs","Integrations","Automation Frameworks","Enterprise Data Governance","Security Models","Professional Services","Scripting","Low-Code/No-Code Platforms"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:41:11.308Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA; New York, NY; Remote - US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Solution Architecture, Data Modelling, Database Design, Governance, AI-Enabled SaaS Capabilities, LLMs, AI Automation, AI-Assisted Workflows, AI Copilots, Process Mapping, Architecture Documentation Tools, APIs, Integrations, Automation Frameworks, Enterprise Data Governance, Security Models, Professional Services, Scripting, Low-Code/No-Code Platforms"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a95e0984-1cb"},"title":"Senior Business Solutions Engineer, BizTech","description":"<p><strong>Job Title</strong></p>\n<p>Senior Business Solutions Engineer, BizTech</p>\n<p><strong>Job Description</strong></p>\n<p>We are seeking a world-class Senior Business Systems Engineer to join our dynamic team. As a Senior Business Solutions Engineer, you will be responsible for providing quick solutions utilizing internal tools and agentic AI to support and assist teams in enhancing their day-to-day workflows. You will oversee the end-to-end delivery of system and tool changes as required, and provide support to those solutions.</p>\n<p>Your expertise will help identify current pain points for stakeholders, promote industry best practices, and maintain data integrity and accuracy within our platforms. You will also play a crucial role in mentoring junior team members, driving strategy, and delivering results with excellence.</p>\n<p>A high degree of focus on practical AI and understanding the current state of industry as AI technology changes is critical.</p>\n<p><strong>Responsibilities</strong></p>\n<p><strong>Product Roadmap Ownership</strong></p>\n<p>Lead the design, development, implementation, and optimization of custom solutions to improve internal teams&#39; processes and workflows, with a focus on Agentic AI solutions utilizing Airbnb-approved internal tools.</p>\n<p><strong>Stakeholder Management</strong></p>\n<p>Identify and address current pain points, socialize industry best practices, and prioritize project pipelines in collaboration with stakeholders.</p>\n<p><strong>Automation</strong></p>\n<p>Identify and drive automation initiatives to reduce manual processes and improve operational efficiency across core business functions.</p>\n<p><strong>Vendor Management</strong></p>\n<p>Partner with our vendors to drive and own the product feature and bug lifecycle from requirements gathering to launch.</p>\n<p><strong>Industry Trends</strong></p>\n<p>Stay updated with the latest platform features, best practices, and industry trends to proactively identify opportunities for enhancements.</p>\n<p><strong>Business Process Improvement</strong></p>\n<p>Craft and suggest future-state business processes, and lead the implementation of these suggestions.</p>\n<p><strong>Expertise and Mentorship</strong></p>\n<p>Act as a company-wide expert in relevant functional tools and applications. 
Mentor junior team members and provide guidance on development and delivery.</p>\n<p><strong>Architecture and Design Reviews</strong></p>\n<p>Collaborate on architecture and design reviews, offering necessary guidance to ensure development guidelines are followed.</p>\n<p><strong>Requirements</strong></p>\n<p><strong>Experience</strong></p>\n<ul>\n<li>8+ years of professional experience with proven enterprise-grade systems implementations, IT consulting, or similar field leveraging advanced platform integrations</li>\n<li>Minimum of 2 years of experience demonstrably using AI in workflows and to support customers</li>\n<li>Experience researching and implementing custom applications from the Marketplace from research, to design and implementation</li>\n<li>Advanced degree (e.g., MS/MBA) in Business, Computer Science, or related fields preferred</li>\n<li>Demonstrated experience designing and implementing custom business solutions and integrations using enterprise platforms</li>\n<li>Deep knowledge of data modeling, business intelligence tools (e.g., Tableau), and enterprise reporting best practices</li>\n<li>Skilled at stakeholder engagement and cross-functional collaboration in large global organizations</li>\n<li>Adept at leading complex projects, simplifying technical concepts, and promoting operational excellence</li>\n<li>Excellent written and verbal communication skills, especially in critiquing technical designs and leading workshops, and engaging with senior leadership</li>\n<li>Exemplifies strong leadership fundamentals, guides junior team members, and drives strategy with a focus on long-term scalability</li>\n<li>Proficient in designing and implementing data management strategies to ensure data integrity</li>\n<li>Identifies opportunities for product and process improvements and can lead and implement said improvements, obtaining necessary buy-in from stakeholders and leadership</li>\n</ul>\n<p><strong>Preferred Qualifications</strong></p>\n<p>Advanced degree (e.g., MS/MBA) in Business, Computer Science, or related fields</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_a95e0984-1cb","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Airbnb","sameAs":"https://www.airbnb.com/","logo":"https://logos.yubhub.co/airbnb.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/airbnb/jobs/7716311","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Enterprise-grade systems implementations","IT consulting","Agentic AI","Data modeling","Business intelligence tools","Enterprise reporting best practices","Stakeholder engagement","Cross-functional collaboration","Leadership","Data management strategies"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:40:58.152Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bangalore, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Enterprise-grade systems implementations, IT consulting, Agentic AI, Data modeling, Business intelligence tools, Enterprise reporting best practices, Stakeholder engagement, Cross-functional collaboration, Leadership, Data management strategies"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7dee17b8-923"},"title":"Data Science Engineer, Capacity & Efficiency","description":"<p>As a Data Science Engineer, Capacity &amp; Efficiency, you will play a critical 
role in Anthropic&#39;s mission of building safe and beneficial AI by ensuring we understand, optimize, and strategically manage our cloud infrastructure spend.</p>\n<p>You will work closely with Compute Finance, Infrastructure Engineers, and Product to translate raw cloud billing data into actionable efficiency insights and influence capacity planning &amp; allocation. You will help build deep visibility into our infrastructure spend, forecast capacity needs, attribute costs accurately across teams and workloads, model resource demand curves, and help identify efficiency opportunities across our fleet.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Build and maintain cloud cost attribution models that accurately allocate infrastructure spend (compute, accelerators, storage, networking, data transfer) across teams, products, and workloads, providing clear visibility into who is spending what and why.</li>\n<li>Partner with infrastructure, finance, and procurement stakeholders to analyze utilization patterns, identify inefficiencies, and drive optimization initiatives that improve the cost-effectiveness of our non-accelerator cloud resources.</li>\n<li>Develop forecasting models for non-accelerator infrastructure demand, incorporating business growth projections, product roadmaps, and historical spend trends to enable proactive capacity planning and budget accuracy.</li>\n<li>Define and track unit cost metrics (e.g., cost per request, cost per GB stored, cost per pipeline run) and identify opportunities to reduce them, influencing infrastructure and engineering roadmaps with data-driven recommendations.</li>\n<li>Develop unit cost economics for various workloads and applications, and using the metrics to drive efficiency efforts across product and infrastructure teams.</li>\n<li>Build a cost-aware culture across the organization by creating self-serve dashboards, automated reporting, and accessible datasets that give engineering and finance teams clear visibility into cloud spend and efficiency metrics.</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>6+ years of experience in data science, analytics, or FinOps roles, with a focus on cloud infrastructure cost analysis, capacity planning, or efficiency optimization.</li>\n<li>Experience building spend forecasting models and large-scale cost attribution systems.</li>\n<li>Deep knowledge of cloud billing systems, cost allocation methodologies, and spend optimization levers (e.g., reserved instances, committed use discounts, rightsizing, spot/preemptible usage).</li>\n<li>Expertise in Python, SQL, forecasting, data modeling and data visualization tools.</li>\n<li>Strong communication and presentation skills.</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<ul>\n<li>Competitive compensation and benefits package.</li>\n<li>Optional equity donation matching.</li>\n<li>Generous vacation and parental leave.</li>\n<li>Flexible working hours.</li>\n<li>Lovely office space in San Francisco.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7dee17b8-923","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.com/","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5125881008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$275,000-$370,000 USD","x-skills-required":["cloud infrastructure cost analysis","capacity planning","efficiency optimization","Python","SQL","forecasting","data modeling","data visualization"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:37:33.429Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City, NY; San Francisco, CA | New York City, NY; Seattle, WA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"cloud infrastructure cost analysis, capacity planning, efficiency optimization, Python, SQL, forecasting, data modeling, data visualization","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":275000,"maxValue":370000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2f60560e-b87"},"title":"Applied Math Tutor","description":"<p>As an AI Tutor - Applied Math Specialist, you&#39;ll play a key role in advancing xAI&#39;s mission by enhancing our AI technologies through high-quality inputs, labels, and annotations using specialized software.</p>\n<p>You&#39;ll collaborate with our technical team to train models on human interactions, problem-solving, and discussions; refine annotation tools; and select/create challenging applied math problems to boost performance.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Use proprietary software applications to provide input/labels on defined projects.</li>\n<li>Support and ensure the delivery of high-quality curated data.</li>\n<li>Play a pivotal role in supporting and contributing to the training of new tasks, working closely with the technical staff to ensure the successful development and implementation of cutting-edge initiatives/technologies.</li>\n<li>Interact with the technical staff to help improve the design of efficient annotation tools.</li>\n<li>Choose problems from applied math domains that align with your expertise, focusing on areas such as probability, statistics, numerical analysis, optimization, operations research, dynamical systems, data modeling, and related fields where you can confidently provide detailed solutions and evaluate model responses.</li>\n</ul>\n<p>Basic Qualifications:</p>\n<ul>\n<li>Must have either (a) a Master’s or PhD in Mathematics with a specialization in a subdomain of applied mathematics (there is a separate position for pure math) or (b) a medal in an International Math Olympiad (IMO) or similar level math competition.</li>\n<li>Proficiency in reading and writing, both in informal and professional English.</li>\n<li>Strong ability to navigate various information resources and databases.</li>\n<li>Outstanding communication, interpersonal, analytical, and organizational capabilities.</li>\n<li>Solid reading comprehension skills combined with capacity to exercise autonomous judgment even when presented with limited data/material.</li>\n<li>A strong passion for and commitment to technological advancements and 
innovation.</li>\n</ul>\n<p>Preferred Skills and Experience:</p>\n<ul>\n<li>Math PhD with a specialization in a subdomain of applied mathematics.</li>\n<li>Peer-reviewed applied math publications in well-regarded journals.</li>\n<li>Previous AI Tutoring experience and/or experience teaching college courses.</li>\n</ul>\n<p>Location and Other Expectations:</p>\n<ul>\n<li>Tutor roles may be offered as full-time, part-time, or contractor positions, depending on role needs and candidate fit.</li>\n<li>For contractor positions, hours will vary widely based on project scope and contractor availability, with no fixed commitments required. On average most projects may involve at least 10 hours per week to achieve deliverables effectively though this is not a fixed commitment and depends on the scope of work. Contractors have full flexibility to set their own hours and determine the exact amount of time needed to complete deliverables.</li>\n<li>Tutor roles may be performed remotely from any location worldwide, subject to legal eligibility, time-zone compatibility, and role specific needs.</li>\n<li>For US based candidates, please note we are unable to hire in the states of Wyoming and Illinois at this time.</li>\n<li>We are unable to provide visa sponsorship.</li>\n<li>For those who will be working from a personal device, your computer must be a Chromebook, Mac with MacOS 11.0 or later, or Windows 10 or later.</li>\n</ul>\n<p>Compensation and Benefits:</p>\n<p>US based candidates: $45/hour - $75/hour depending on factors including relevant experience, skills, education, geographic location, and qualifications. International candidates: Information will be provided to you during the recruitment process.</p>\n<p>Benefits vary based on employment type, location and jurisdiction. Benefits for eligible U.S. based positions include health insurance, 401(k) plan, and paid sick leave. 
Specific details and role specific information will be provided to you during the interview process.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2f60560e-b87","directApply":true,"hiringOrganization":{"@type":"Organization","name":"xAI","sameAs":"https://www.xai.com/","logo":"https://logos.yubhub.co/xai.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/xai/jobs/4925878007","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time|part-time|contract","x-salary-range":"$45/hour - $75/hour","x-skills-required":["applied mathematics","probability","statistics","numerical analysis","optimization","operations research","dynamical systems","data modeling"],"x-skills-preferred":["math PhD","peer-reviewed publications","AI tutoring experience"],"datePosted":"2026-04-18T15:32:44.335Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"applied mathematics, probability, statistics, numerical analysis, optimization, operations research, dynamical systems, data modeling, math PhD, peer-reviewed publications, AI tutoring experience"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7e1c09f5-e8b"},"title":"Director of Sales Engineering and Solution Design","description":"<p>We&#39;re seeking a seasoned and technically sophisticated Director of Sales Engineering &amp; Solution Design to join our commercial leadership team.</p>\n<p>As healthcare organisations accelerate their adoption of AI and data-driven approaches to care, the way they evaluate and adopt platforms like Zus is changing. 
This role sits at the intersection of technical strategy, customer partnership, and market shaping.</p>\n<p>You will define how prospects experience Zus - from first technical conversation through product deployment - and ensure every engagement demonstrates clear, measurable value.</p>\n<p><strong>Responsibilities</strong></p>\n<p><strong>Team Leadership</strong></p>\n<ul>\n<li>Lead, mentor, and grow a team of Sales Engineers, setting clear expectations, coaching on technical depth and customer engagement</li>\n<li>Establish team operating playbooks and a culture of continuous learning across AI, interoperability, and solution design</li>\n<li>Recruit and onboard top-tier technical and commercial talent as the team scales with Zus’s growth</li>\n</ul>\n<p><strong>Solution Design &amp; Technical Architecture</strong></p>\n<ul>\n<li>Personally drive solution design and technical architecture for complex, high-impact prospects in a healthcare setting</li>\n<li>Create detailed architectural diagrams (data flow, integration topology, system context) that clearly communicate how Zus fits within a prospect’s technical ecosystem</li>\n<li>Design and validate customer integrations with Zus APIs, healthcare data models, and interoperability standards</li>\n<li>Act as the senior technical escalation point for complex integrations, edge cases, and customer challenges</li>\n<li>Develop and maintain a library of reference architectures, reusable solution patterns, and integration blueprints that accelerate customer delivery</li>\n<li>Design solutions that incorporate AI and machine learning capabilities - including clinical data enrichment, intelligent matching, predictive insights, and workflow automation</li>\n</ul>\n<p><strong>Pre-Sales Execution &amp; Technical Storytelling</strong></p>\n<ul>\n<li>Own and execute the technical pre-sales motion, from discovery through proof-of-concept and handoff to post-sale teams</li>\n<li>Deliver and oversee technical demos and executive-level presentations to audiences ranging from engineers to C-suite stakeholders</li>\n<li>Translate sometimes ambiguous customer requirements into crisp, feasible technical proposals with clear success criteria</li>\n</ul>\n<p><strong>Cross-Functional Collaboration</strong></p>\n<ul>\n<li>Partner with Product and Engineering to translate customer feedback into product improvements and roadmap input</li>\n<li>Ensure a smooth, well-documented handoff from pre-sales to Customer Success and Implementation</li>\n<li>Collaborate cross-functionally to improve go-to-market strategy, technical enablement, and sales effectiveness</li>\n<li>Balance leadership with execution,jumping in wherever needed to help the team and the company move fast</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>6+ years of experience in Sales Engineering, Solutions Architecture, or Technical Implementation, including experience in healthtech or healthcare data platforms</li>\n<li>Prior experience managing or mentoring a small technical team, with a hands-on leadership style</li>\n<li>Strong hands-on experience designing and implementing solutions using RESTful APIs, healthcare interoperability standards (FHIR, C-CDA, HL7), and EMR data</li>\n<li>Proficiency in SQL and comfort working directly with complex healthcare data models</li>\n<li>Demonstrated ability to design end-to-end solutions that balance technical feasibility, customer needs, and business outcomes</li>\n<li>Excellent communication skills and the ability to clearly explain complex 
concepts to technical and non-technical stakeholders</li>\n<li>A customer-first mindset and a passion for building trusted technical partnerships</li>\n<li>Comfort operating in ambiguity and building structure in an early-stage environment</li>\n<li>Calm, adaptable, and effective when navigating complex or high-pressure situations</li>\n</ul>\n<p><strong>Preferred &amp; Differentiating Experience</strong></p>\n<ul>\n<li>Familiarity with AI/ML applications in healthcare including: NLP for clinical data, predictive analytics, intelligent document processing, or LLM-powered workflows</li>\n<li>Experience building or presenting AI-augmented product demos and proof-of-concepts</li>\n<li>Background with cloud-native architectures (AWS, GCP, or Azure), containerization, and modern CI/CD pipelines</li>\n<li>Exposure to value-based care models, population health platforms, or payer-provider data exchange</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive compensation that reflects the value you bring to the team a combination of cash and equity</li>\n<li>Robust benefits that include health insurance, wellness benefits, 401k with a match, unlimited PTO</li>\n<li>Opportunity to work alongside a passionate team that is determined to help change the world (and have fun doing it)</li>\n</ul>\n<p><strong>How to Apply</strong></p>\n<ul>\n<li>Please note that research shows that candidates from underrepresented backgrounds often don’t apply unless they meet 100% of the job criteria.</li>\n<li>While we have worked to consolidate the minimum qualifications for each role, we aren’t looking for someone who checks each box on a page; we’re looking for active learners and people who care about disrupting the current healthcare system with their unique experiences.</li>\n<li>We do not conduct interviews by text nor will we send you a job offer unless you’ve interviewed with multiple people, including the Director of People &amp; Talent, over video interviews.</li>\n<li>Job scams do exist so please be careful with your personal information.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7e1c09f5-e8b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Zus","sameAs":"https://zus.com/","logo":"https://logos.yubhub.co/zus.com.png"},"x-apply-url":"https://jobs.lever.co/zushealth/697c598a-ef7f-4719-b816-fbb037dd9aef","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"USD 150,000-200,000 per year","x-skills-required":["RESTful APIs","healthcare interoperability standards (FHIR, C-CDA, HL7)","EMR data","SQL","complex healthcare data models","AI/ML applications in healthcare","NLP for clinical data","predictive analytics","intelligent document processing","LLM-powered workflows"],"x-skills-preferred":["cloud-native architectures (AWS, GCP, or Azure)","containerization","modern CI/CD pipelines","value-based care models","population health platforms","payer-provider data exchange"],"datePosted":"2026-04-17T13:12:55.685Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"RESTful APIs, healthcare interoperability standards (FHIR, C-CDA, HL7), EMR data, SQL, complex healthcare data models, AI/ML applications in healthcare, NLP 
for clinical data, predictive analytics, intelligent document processing, LLM-powered workflows, cloud-native architectures (AWS, GCP, or Azure), containerization, modern CI/CD pipelines, value-based care models, population health platforms, payer-provider data exchange","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":150000,"maxValue":200000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6acd8036-5ec"},"title":"Platform Engineer (Databases & Storage)","description":"<p>We are looking for a Staff Platform Engineer to own the database and storage foundation of World Labs. This is a high-impact systems role at the intersection of databases, distributed systems, and AI infrastructure. You will define how core data systems are designed, scaled, and operated in an environment where workloads are evolving quickly and requirements are often ambiguous.</p>\n<p>Your responsibilities will include owning the design and evolution of the transactional systems that power the platform, defining architecture for database and storage systems under high-throughput, low-latency workloads, making and driving decisions around data modeling, indexing, replication, and consistency, debugging and resolving complex production issues, establishing standards for reliability, observability, and operability across the platform, partnering with product and research teams to support evolving and often ambiguous requirements, driving improvements in performance, scalability, and cost across the system, mentoring engineers and raising the bar for system design and technical decision-making.</p>\n<p>Key qualifications include 10+ years of experience building and operating production systems at scale, with ownership of critical infrastructure, strong experience designing and operating transactional systems and databases, deep understanding of data modeling, indexing, transactions, concurrency, and consistency tradeoffs, experience owning systems with strict reliability and performance requirements in production, strong experience debugging complex production issues and reasoning about failure modes, experience designing distributed systems or large-scale infrastructure where tradeoffs are non-trivial, proven ability to define architecture and drive technical decisions end-to-end, strong judgment in balancing performance, reliability, and cost, ability to operate effectively in ambiguous, fast-moving environments with high ownership.</p>\n<p>Preferred qualifications include experience with database internals, storage systems, or query engines, experience building infrastructure for AI/ML systems or data platforms, experience in early-stage or high-growth environments.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6acd8036-5ec","directApply":true,"hiringOrganization":{"@type":"Organization","name":"World Labs","sameAs":"https://www.worldlabs.ai","logo":"https://logos.yubhub.co/worldlabs.ai.png"},"x-apply-url":"https://job-boards.greenhouse.io/worldlabs/jobs/4194381009","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$200-$300k base salary (good-faith estimate for San Francisco Bay Area upon hire; actual offer based on experience, skills, and qualifications)","x-skills-required":["database 
internals","storage systems","query engines","data modeling","indexing","transactions","concurrency","consistency","distributed systems","large-scale infrastructure"],"x-skills-preferred":["AI/ML systems","data platforms"],"datePosted":"2026-04-17T13:09:33.493Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"database internals, storage systems, query engines, data modeling, indexing, transactions, concurrency, consistency, distributed systems, large-scale infrastructure, AI/ML systems, data platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":200000,"maxValue":300000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ceba9e5b-250"},"title":"Senior Backend Engineer, Product and Infra","description":"<p>We&#39;re looking for a Senior Backend Engineer to build the systems and services that power our product experience. You&#39;ll own the backend infrastructure that makes our content discoverable, our features responsive, and our platform reliable at scale.</p>\n<p>Your work will directly shape what users experience: designing APIs that serve rich content, building services that handle real-time interactions, implementing content-matching systems for rights and safety, and ensuring our platform performs under load. You&#39;ll architect systems that are fast, correct, and maintainable.</p>\n<p>You&#39;ll collaborate closely with Product, ML Research, and Mobile/Web teams to ship features that matter. We use Python, Go, BigQuery, Pub/Sub, and a microservices architecture, but we care more about good judgment than specific tool experience.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design and maintain application-level data models that organize rich content into canonical structures optimized for product features, search, and retrieval.</li>\n<li>Build high-reliability ETLs and streaming pipelines to process usage events, analytics data, behavioral signals, and application logs.</li>\n<li>Develop data services that expose unified content to the application, such as metadata access APIs, indexing workflows, and retrieval-ready representations.</li>\n<li>Implement and refine fingerprinting pipelines used for deduplication, rights attribution, safety checks, and provenance validation.</li>\n<li>Own data consistency between ingestion systems, application surfaces, metadata storage, and downstream reporting environments.</li>\n<li>Define and track key operational metrics, including latency, completeness, accuracy, and event health.</li>\n<li>Collaborate with Product teams to ensure content structures and APIs support evolving features and high-quality user experiences.</li>\n<li>Partner with Analytics and Research teams to deliver clean usage datasets for experimentation, model evaluation, reporting, and internal insights.</li>\n<li>Operate large analytical workloads in BigQuery and build reusable Dataflow/Beam components for structured processing.</li>\n<li>Improve reliability and scale by designing robust schema evolution strategies, idempotent pipelines, and well-instrumented operational flows.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Experience building production backend services and APIs at scale</li>\n<li>Experience building ETL/ELT pipelines, event processing 
systems, and structured data models for applications or analytics</li>\n<li>Strong background in data modeling, metadata systems, indexing, or building canonical representations for heterogeneous content</li>\n<li>Proficiency in Python, Go, SQL, and scalable data-processing frameworks (Dataflow/Beam, Spark, or similar)</li>\n<li>Familiarity with BigQuery or other analytical data warehouses and strong comfort optimizing large queries and schemas</li>\n<li>Experience with event-driven architectures, Pub/Sub, or Kafka-like systems</li>\n<li>Strong understanding of data quality, schema evolution, lineage, and operational reliability</li>\n<li>Ability to design pipelines that balance cost, latency, correctness, and scale</li>\n<li>Clear communication skills and an ability to collaborate closely with Product, Research, and Analytics stakeholders</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Experience building application-facing APIs or microservices that expose structured content</li>\n<li>Background in information retrieval, indexing systems, or search infrastructure</li>\n<li>Experience with fingerprinting, perceptual hashing, audio similarity metrics, or content-matching algorithms</li>\n<li>Familiarity with ML workflows and how downstream analytics and usage data feed back into research pipelines</li>\n<li>Understanding of batch + streaming architectures and how to blend them effectively</li>\n<li>Experience with Go, Next.js, or React Native for occasional full-stack contributions</li>\n</ul>\n<p><strong>Why Join Us</strong></p>\n<p>You will design the core data services and pipelines that power our product experience, analytics, and business operations. You’ll work on high-impact data challenges involving real-time signals, large-scale metadata systems, and cross-platform consistency. 
You’ll join a small, fast-moving team where you’ll shape the structure, reliability, and intelligence of our downstream data ecosystem.</p>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Highly competitive salary and equity</li>\n<li>Quarterly productivity budget</li>\n<li>Flexible time off</li>\n<li>Fantastic office location in Manhattan</li>\n<li>Productivity package, including ChatGPT Plus, Claude Code, and Copilot</li>\n<li>Top-notch private health, dental, and vision insurance for you and your dependents</li>\n<li>401(k) plan options with employer matching</li>\n<li>Concierge medical/primary care through One Medical and Rightway</li>\n<li>Mental health support from Spring Health</li>\n<li>Personalized life insurance, travel assistance, and many other perks</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ceba9e5b-250","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Udio","sameAs":"https://www.udio.com/","logo":"https://logos.yubhub.co/udio.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/udio/jobs/4987729008","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,000 - $220,000","x-skills-required":["Python","Go","BigQuery","Pub/Sub","Data modeling","Metadata systems","Indexing","Canonical representations","ETL/ELT pipelines","Event processing systems","Structured data models","Scalable data-processing frameworks","Analytical data warehouses","Event-driven architectures","Kafka-like systems","Data quality","Schema evolution","Lineage","Operational reliability"],"x-skills-preferred":["Application-facing APIs","Microservices","Information retrieval","Indexing systems","Search infrastructure","Fingerprinting","Perceptual hashing","Audio similarity metrics","Content-matching algorithms","ML workflows","Batch + streaming architectures"],"datePosted":"2026-04-17T13:05:20.076Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Go, BigQuery, Pub/Sub, Data modeling, Metadata systems, Indexing, Canonical representations, ETL/ELT pipelines, Event processing systems, Structured data models, Scalable data-processing frameworks, Analytical data warehouses, Event-driven architectures, Kafka-like systems, Data quality, Schema evolution, Lineage, Operational reliability, Application-facing APIs, Microservices, Information retrieval, Indexing systems, Search infrastructure, Fingerprinting, Perceptual hashing, Audio similarity metrics, Content-matching algorithms, ML workflows, Batch + streaming architectures","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2dd69016-e4d"},"title":"Senior Engineer, Foundry Applications","description":"<p>We&#39;re growing quickly and looking for a Full-Stack Software Engineer to help design, develop, and deploy robust applications that directly enable manufacturing, operational planning, and execution , all within a Foundry-integrated ecosystem.</p>\n<p>As a Senior Engineer, Foundry Applications, you will execute the development and release of Foundry-backed applications that power MES, 
ERP, and operational analytics. You will develop end-to-end features: From backend services and data transformations to user-facing interfaces , ensuring performance, maintainability, and security.</p>\n<p>Collaborate with production managers, engineers, and supply chain teams to understand workflows and deliver scalable software solutions. Leverage Palantir Foundry: Use Foundry&#39;s APIs, ontology, and visualization tools to integrate data across multiple domains. Deliver continuously: Write well-tested, maintainable code and iterate quickly based on real-world feedback.</p>\n<p>Shape the roadmap: Contribute to technical direction, make sound architecture decisions, and help scale our Foundry development capabilities.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2dd69016-e4d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/c2479afc-d804-4b2a-a3b8-0b111f785e8e","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$125,000 - $175,000 a year","x-skills-required":["full-stack application development","Python data-processing pipelines","data modeling","software design","testing"],"x-skills-preferred":["Palantir Foundry","Typescript","modern front-end frameworks","software for manufacturing","modern software delivery practices"],"datePosted":"2026-04-17T13:04:07.172Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dallas"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"full-stack application development, Python data-processing pipelines, data modeling, software design, testing, Palantir Foundry, Typescript, modern front-end frameworks, software for manufacturing, modern software delivery practices","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":125000,"maxValue":175000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5c694170-8a7"},"title":"Senior Engineer, Full-Stack Software - Foundry Applications","description":"<p>The Applications team within Software Integration &amp; Operations at Shield AI builds internal tools powering critical operations across Production, Customer Success, Supply Chain, and Engineering. 
We&#39;re looking for a Full-Stack Software Engineer to design, develop, and deploy robust applications enabling manufacturing, operational planning, and execution within a Foundry-integrated ecosystem.</p>\n<p>Key responsibilities include executing the development and release of Foundry-backed applications, developing end-to-end features, collaborating with production managers and supply chain teams, leveraging Palantir Foundry, and delivering continuously.</p>\n<p>Required qualifications include 5+ years of experience in full-stack application development, hands-on experience designing and building React applications and TypeScript/Javascript-based backends, strong skills in data modeling, and solid foundation in software design, testing, and documentation.</p>\n<p>Preferred qualifications include familiarity with Palantir Foundry, hands-on experience designing and operating Python data-processing pipelines, and experience building software for manufacturing, supply chain, or aircraft maintenance.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5c694170-8a7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/12901f0e-56f1-4654-939d-1605d5b10e75","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$120,000 - $190,000 a year","x-skills-required":["full-stack application development","React application development","TypeScript/Javascript-based backend development","data modeling","software design","testing","documentation"],"x-skills-preferred":["Palantir Foundry","Python data-processing pipelines","manufacturing software development","supply chain software development","aircraft maintenance software development"],"datePosted":"2026-04-17T13:03:38.369Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dallas"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"full-stack application development, React application development, TypeScript/Javascript-based backend development, data modeling, software design, testing, documentation, Palantir Foundry, Python data-processing pipelines, manufacturing software development, supply chain software development, aircraft maintenance software development","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120000,"maxValue":190000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_319e8ea9-a7e"},"title":"Sr. 
Manager of Sales Compensation","description":"<p>We are seeking an experienced and highly analytical Senior Manager of Sales Compensation to develop, implement, and manage sales commission plans that drive performance, align with business objectives, and ensure accuracy and timely payment to the sales organisation.</p>\n<p>The ideal candidate will possess a deep understanding of sales processes, compensation best practices, and strong leadership skills to manage the end-to-end compensation cycle, specifically within the high-tech industry.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li><p>Leading the annual review and design process for all sales incentive compensation plans, ensuring alignment with corporate strategy, sales goals, competitive market data, and industry best practices.</p>\n</li>\n<li><p>Modelling, analysing, and forecasting the financial impact of proposed compensation plan changes, presenting data-driven recommendations to executive leadership.</p>\n</li>\n<li><p>Collaborating with Sales, Finance, HR, and Legal teams to ensure plan feasibility, compliance, and effective rollout.</p>\n</li>\n<li><p>Overseeing the end-to-end administration of sales compensation, including territory and quota management, data integrity, calculation, and conflict resolution.</p>\n</li>\n<li><p>Managing and optimising the sales compensation process to ensure accurate and efficient compensation calculations.</p>\n</li>\n<li><p>Evaluating and leading implementation of a robust, scalable compensation system.</p>\n</li>\n<li><p>Developing and maintaining documentation, policies, and procedures related to sales compensation plans and processes.</p>\n</li>\n<li><p>Ensuring compliance with all internal policies and external regulations.</p>\n</li>\n<li><p>Generating regular and ad-hoc compensation reports and dashboards to provide insights into plan performance, sales effectiveness, and cost of sales.</p>\n</li>\n<li><p>Conducting quarterly and annual audits of compensation data and calculations to ensure accuracy and resolve discrepancies.</p>\n</li>\n<li><p>Analysing sales attainment, plan effectiveness, and making data-driven recommendations for plan adjustments.</p>\n</li>\n<li><p>Serving as the primary subject matter expert on all sales compensation matters, including industry standards and emerging trends.</p>\n</li>\n<li><p>Developing and delivering training materials and presentations to educate the sales force and management on compensation plan details and changes.</p>\n</li>\n<li><p>Mentoring and guiding junior team members, fostering a culture of accuracy, accountability, and continuous improvement.</p>\n</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_319e8ea9-a7e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/03f58f8e-e3fe-4b52-a77e-14c154629f3f","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$135,000 - $205,000 a year","x-skills-required":["Sales Compensation","Finance","Sales Operations","Sales Performance Management (SPM) software","Excel","SQL","BI platforms"],"x-skills-preferred":["MBA or Master's degree in a quantitative field","Experience in the Aerospace & Defense, or High-tech industry","Advanced proficiency in data modelling 
and analysis"],"datePosted":"2026-04-17T13:01:01.485Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Sales Compensation, Finance, Sales Operations, Sales Performance Management (SPM) software, Excel, SQL, BI platforms, MBA or Master's degree in a quantitative field, Experience in the Aerospace & Defense, or High-tech industry, Advanced proficiency in data modelling and analysis","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":135000,"maxValue":205000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_eb6eaa25-590"},"title":"Salesforce Developer","description":"<p>Job Overview</p>\n<p>We are seeking a skilled Salesforce Developer to design, develop, and maintain scalable solutions within the Salesforce platform. This role will support business processes across sales, service, and operations while ensuring alignment with enterprise architecture and integration strategies.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Design, develop, and implement Salesforce solutions using Apex, Lightning, and related technologies</li>\n<li>Customize Salesforce objects, workflows, and automation</li>\n<li>Develop and maintain integrations with external systems such as NetSuite and MuleSoft</li>\n<li>Collaborate with business analysts and architects to translate requirements</li>\n<li>Ensure code quality through best practices and testing</li>\n<li>Support deployment, release management, and CI/CD processes</li>\n<li>Troubleshoot and resolve system issues</li>\n<li>Maintain documentation for solutions and enhancements</li>\n<li>Ensure platform performance, scalability, and security</li>\n</ul>\n<p>Qualifications</p>\n<ul>\n<li>5 to 8 plus years of Salesforce development experience</li>\n<li>Strong experience with Apex, Lightning Web Components, and Visualforce</li>\n<li>Experience with Salesforce Sales Cloud and Service Cloud</li>\n<li>Experience with integrations using APIs and middleware</li>\n<li>Familiarity with DevOps tools and deployment pipelines</li>\n<li>Strong understanding of data modeling and platform limits</li>\n<li>Salesforce certifications preferred</li>\n<li>Bachelor’s degree in Computer Science or related field</li>\n</ul>\n<p>Physical Demands</p>\n<ul>\n<li>Prolonged periods of sitting at a desk and working on a computer.</li>\n<li>Occasional standing and walking within the office.</li>\n<li>Manual dexterity to operate a computer keyboard, mouse, and other office equipment.</li>\n<li>Visual acuity to read screens, documents, and reports.</li>\n<li>Occasional reaching, bending, or stooping to access file drawers, cabinets, or office supplies.</li>\n<li>Lifting and carrying items up to 20 pounds occasionally (e.g., office supplies, packages).</li>\n</ul>\n<p>Additional Information</p>\n<ul>\n<li>Benefits: Medical Insurance, Dental and Vision Insurance, Time Off, Parental Leave, Competitive Salary, Retirement Plan, Stock Options, Life and Disability Insurance, Pet Insurance</li>\n<li>Saronic CCPA Notice for Candidates and California Employees: This role requires access to export-controlled information or items that require “U.S. 
Person” status.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_eb6eaa25-590","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Saronic Technologies","sameAs":"https://www.saronictechnologies.com/","logo":"https://logos.yubhub.co/saronictechnologies.com.png"},"x-apply-url":"https://jobs.lever.co/saronic/a0df621a-b4e3-4f2c-8ad1-da14f8790b4d","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Salesforce","Apex","Lightning","NetSuite","MuleSoft","DevOps","Data Modeling","Platform Limits"],"x-skills-preferred":[],"datePosted":"2026-04-17T13:00:17.928Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Salesforce, Apex, Lightning, NetSuite, MuleSoft, DevOps, Data Modeling, Platform Limits"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_85f1ada0-78d"},"title":"Security Engineer","description":"<p>We&#39;re seeking a Security Engineer at the senior-level or above on our Security Operations team with strong detection engineering experience. You&#39;ll design and develop high-fidelity detection content, build and operate the data pipelines that power our security operations, develop automation playbooks that accelerate response, and work across a uniquely diverse telemetry landscape spanning cloud infrastructure, embedded vessel platforms, corporate systems, and operational technology.</p>\n<p>This role is heavily weighted toward detection engineering. You should think in terms of adversary behaviour and telemetry coverage, not just alert triage. 
You&#39;ll own detections end-to-end: from identifying gaps in coverage, through designing and testing detection logic, to tuning and validating in production.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li><p>Design, build, test, and tune high-fidelity detection rules and analytic queries across endpoint, cloud, network, identity, and DLP telemetry sources</p>\n</li>\n<li><p>Develop and maintain detection content using detection-as-code practices including version-controlled logic, automated testing, and CI/CD deployment</p>\n</li>\n<li><p>Map detection coverage to MITRE ATT&amp;CK, identify gaps, and prioritise new detection development based on threat intelligence and business risk</p>\n</li>\n<li><p>Engineer correlation rules, behavioural analytics, and anomaly-based detections that minimise false positives while surfacing real adversary tradecraft</p>\n</li>\n<li><p>Own the detection lifecycle from initial development through production tuning, performance monitoring, and retirement</p>\n</li>\n<li><p>Build and operate pipelines to ingest, normalise, enrich, and manage security telemetry at scale across diverse data sources, using Terraform and infrastructure-as-code practices to deploy and maintain logging and detection infrastructure</p>\n</li>\n<li><p>Design and maintain log collection, parsing, and enrichment configurations that ensure the right telemetry is available at the right fidelity for detection and investigation</p>\n</li>\n<li><p>Evaluate and onboard new telemetry sources as Saronic&#39;s infrastructure and threat landscape evolve</p>\n</li>\n<li><p>Monitor pipeline health, data quality, and ingestion reliability to ensure detections operate on complete and accurate data</p>\n</li>\n<li><p>Develop and manage automated response playbooks in SOAR platforms to accelerate containment and reduce analyst toil</p>\n</li>\n<li><p>Build automation that enriches alerts with contextual data, reducing investigation time and improving analyst decision-making</p>\n</li>\n<li><p>Support incident response efforts and translate lessons learned into improved detections and playbooks</p>\n</li>\n<li><p>Partner with SOC analysts, Cloud Security, Product Security, and IT teams to close visibility and detection gaps across environments</p>\n</li>\n<li><p>Collaborate with threat intelligence to ensure detection engineering is informed by current adversary TTPs relevant to defence, maritime, and autonomous systems</p>\n</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li><p>3+ years of hands-on experience in detection engineering, security operations, security automation, or a closely related security engineering role</p>\n</li>\n<li><p>Demonstrated experience designing, testing, and tuning detection rules and analytic queries across production security telemetry (endpoint, cloud, network, identity, or DLP)</p>\n</li>\n<li><p>Hands-on experience with SIEM platforms and proficiency with query languages such as SPL, KQL, or equivalent</p>\n</li>\n<li><p>Experience building and operating security data pipelines, including log ingestion, normalisation, enrichment, and data quality management</p>\n</li>\n<li><p>Understanding of data engineering concepts including ETL pipelines, data modelling, schema design, and indexing as applied to security telemetry</p>\n</li>\n<li><p>Hands-on coding experience in Python, PowerShell, Go, or Rust for security automation, detection tooling, or pipeline development, and familiarity with Terraform for managing detection and logging infrastructure as 
code</p>\n</li>\n<li><p>Understanding of MITRE ATT&amp;CK framework and its application to detection coverage and gap analysis</p>\n</li>\n<li><p>Ability to obtain and maintain a security clearance</p>\n</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li><p>Experience in defence, aerospace, robotics, autonomy, or other high-assurance environments</p>\n</li>\n<li><p>Experience with EDR platforms including custom detection rule creation and telemetry analysis</p>\n</li>\n<li><p>Experience with cloud-native detection in AWS and Microsoft 365/Azure</p>\n</li>\n<li><p>Experience using Terraform to deploy and manage security monitoring infrastructure, log pipeline components, or cloud-native security service configurations</p>\n</li>\n<li><p>Hands-on experience with incident response, threat hunting, or adversary emulation</p>\n</li>\n<li><p>Exposure to embedded Linux, operational technology, or ICS telemetry and detection</p>\n</li>\n<li><p>Familiarity with NIST SP 800-171, NIST SP 800-53, or CMMC and their logging and monitoring requirements</p>\n</li>\n<li><p>Relevant certifications such as GCIH, GCIA, GCDA, GSOM, OSDA, or OSCP</p>\n</li>\n</ul>\n<p>Additional Information:</p>\n<ul>\n<li><p>Benefits: Medical Insurance, Dental and Vision Insurance, Time Off, Parental Leave, Competitive Salary, Retirement Plan, Stock Options, Life and Disability Insurance, Pet Insurance</p>\n</li>\n<li><p>This role requires access to export-controlled information or items that require &#39;U.S. Person&#39; status.</p>\n</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_85f1ada0-78d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Saronic Technologies","sameAs":"https://www.saronictechnologies.com/","logo":"https://logos.yubhub.co/saronictechnologies.com.png"},"x-apply-url":"https://jobs.lever.co/saronic/79424778-76c1-41c6-8385-cba5f6ddc50e","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["detection engineering","security operations","security automation","SIEM platforms","query languages","data engineering","ETL pipelines","data modelling","schema design","indexing","Python","PowerShell","Go","Rust","Terraform","MITRE ATT&CK framework","security clearance"],"x-skills-preferred":["EDR platforms","cloud-native detection","incident response","threat hunting","adversary emulation","embedded Linux","operational technology","ICS telemetry","NIST SP 800-171","NIST SP 800-53","CMMC","GCIH","GCIA","GCDA","GSOM","OSDA","OSCP"],"datePosted":"2026-04-17T12:56:57.672Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"detection engineering, security operations, security automation, SIEM platforms, query languages, data engineering, ETL pipelines, data modelling, schema design, indexing, Python, PowerShell, Go, Rust, Terraform, MITRE ATT&CK framework, security clearance, EDR platforms, cloud-native detection, incident response, threat hunting, adversary emulation, embedded Linux, operational technology, ICS telemetry, NIST SP 800-171, NIST SP 800-53, CMMC, GCIH, GCIA, GCDA, GSOM, OSDA, 
OSCP"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2289a325-de7"},"title":"Software Engineer","description":"<p>We&#39;re looking for a Software Engineer to join our team. As a Software Engineer at Saronic Technologies, you will maintain and build the Foundry platform alongside Palantir specialists. You will collaborate with internal stakeholders, collect requirements, and write code that results in secure, timely, accurate, and reliable data models in Palantir Foundry.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Maintain and build the Foundry platform alongside Palantir specialists.</li>\n<li>Collaborate with internal stakeholders.</li>\n<li>Collect requirements and write code that results in secure, timely, accurate, and reliable data models in Palantir Foundry.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s degree.</li>\n<li>1+ years of experience in Information Technology.</li>\n<li>1-2 years of experience with hands-on client-facing development and data integration.</li>\n<li>Experience with Python and SQL.</li>\n<li>Experience with data modeling, ETL, and data visualization.</li>\n<li>Experience with Git-based code repositories and CI/CD workflows.</li>\n<li>Ability to lead fast-paced, Agile environments.</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Medical Insurance: Comprehensive health insurance plans covering a range of services.</li>\n<li>Dental and Vision Insurance: Coverage for routine dental check-ups, orthodontics, and vision care.</li>\n<li>Time Off: Generous PTO and Holidays.</li>\n<li>Parental Leave: Paid maternity and paternity leave to support new parents.</li>\n<li>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses.</li>\n<li>Retirement Plan: 401(k) plan.</li>\n<li>Stock Options: Equity options to give employees a stake in the company&#39;s success.</li>\n<li>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage.</li>\n<li>Additional Perks: Free lunch benefit and unlimited free drinks and snacks in the office.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2289a325-de7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Saronic Technologies","sameAs":"https://www.saronictechnologies.com/","logo":"https://logos.yubhub.co/saronictechnologies.com.png"},"x-apply-url":"https://jobs.lever.co/saronic/febf9919-3e0a-46ab-97aa-c53256e18bf7","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","SQL","data modeling","ETL","data visualization","Git","CI/CD workflows"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:56:40.538Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, SQL, data modeling, ETL, data visualization, Git, CI/CD workflows"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0549f86b-cdd"},"title":"Principal Software Engineer, Backend Systems","description":"<p>Role</p>\n<p>We&#39;re on a mission to redefine enterprise software, and we&#39;re looking for a Principal Software Engineer, Backend Systems to help push the boundaries of 
what&#39;s possible.</p>\n<p>If you love designing and scaling complex backend systems, have shipped major projects or entire products, and can think fluently about distributed systems, data modeling, API design, and integrations, this role is for you.</p>\n<p>You&#39;ll architect and scale our Go/Postgres/Redis/GraphQL backend, working alongside world-class engineers and product minds to drive high-impact projects, lead critical design discussions, and collaborate directly with customers to shape a platform that&#39;s transforming supply chain, manufacturing, and beyond.</p>\n<p>The industry is rooting for us, and you&#39;ll play a pivotal role in making it happen.</p>\n<p>This is a high-autonomy, high-impact individual contributor role, but if you&#39;re interested in growing into people management, the opportunity is there depending on your interest and performance.</p>\n<p>Product</p>\n<p>We&#39;re building the AI-first ERP to replace decades-old giants like SAP and Oracle, transforming how enterprises operate.</p>\n<p>Our platform can generate any enterprise workflow application in minutes, a dramatic leap from the 1-2 years it traditionally takes IT teams to build them, giving process owners in supply chain, manufacturing, and operations the power to standardize, streamline, and drive their work to completion, no matter the complexity.</p>\n<p>This breakthrough is powered by our workflow, forms, and AI engines, as well as our in-house Large Tabular Model, a first-of-its-kind innovation.</p>\n<p>Customers aren&#39;t just adopting our platform, they&#39;re clamoring for more, rapidly expanding their use cases as we enter an exhilarating growth phase.</p>\n<p>As one user put it: “I’ve been waiting for this for 20 years.”</p>\n<p>Culture and Compensation</p>\n<p>We are a customer-obsessed, product-driven company that is building a flexible, hybrid/remote culture to enable the brightest minds in the industry.</p>\n<p>We are particularly interested in candidates based in our hubs of Seattle, San Francisco and New York, but we will consider candidates who live anywhere in the US, Canada, or Mexico.</p>\n<p>We have industry-leading compensation packages, including equity and health benefits.</p>\n<p>We are willing to sponsor US work authorization if needed.</p>\n<p>Requirements</p>\n<ul>\n<li><p>M.S. 
in Computer Science or a related field will be considered with substantial relevant experience)</p>\n</li>\n<li><p>5+ years of industry experience as a backend software engineer, with a focus on large-scale, user-facing web applications in companies like Slack, Uber, or similar</p>\n</li>\n<li><p>Proven experience in the architecture and design of large data systems, particularly for software as a service (SaaS)</p>\n</li>\n<li><p>Extensive experience in database systems development, data modeling, distributed systems and building robust application backends</p>\n</li>\n<li><p>Fluency with databases, APIs and modern backend technologies (experience with Go and GraphQL is strongly preferred, with the ability to quickly learn new technologies as needed)</p>\n</li>\n<li><p>A builder&#39;s spirit (you have a track record of building projects for fun, staying updated with open-source developments, etc.)</p>\n</li>\n<li><p>Ability to lead projects independently and collaboratively in a fast-paced startup environment</p>\n</li>\n<li><p>Excellent written and verbal communication skills</p>\n</li>\n<li><p>Strong enthusiasm for continuous learning and professional growth and for mentoring peers to help them grow as engineers</p>\n</li>\n</ul>\n<p>Responsibilities</p>\n<ul>\n<li><p>Architect, design and develop scalable and robust backend systems for large data software-as-a-service (SaaS) applications, ensuring high performance and reliability.</p>\n</li>\n<li><p>Collaborate with cross-functional teams including Design, Product Management and industry experts to build high-quality product features.</p>\n</li>\n<li><p>Lead and mentor a team of engineers, providing guidance and expertise in backend development, database systems and distributed systems.</p>\n</li>\n<li><p>Stay abreast of emerging technologies and industry trends, incorporating new developments into the backend architecture and processes where appropriate.</p>\n</li>\n<li><p>Participate in code reviews, technical discussions and decision-making processes to maintain high standards of code quality and best practices.</p>\n</li>\n<li><p>Drive the adoption of best practices in backend development, data modeling and API design, ensuring the scalability and maintainability of the system.</p>\n</li>\n<li><p>Champion a culture of innovation, encouraging and leading initiatives to explore new technologies and improve existing systems.</p>\n</li>\n</ul>\n<p>Additional Information</p>\n<p>If this sounds exciting, please apply and we&#39;ll get back to you promptly if we see a fit!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0549f86b-cdd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Regrello","sameAs":"https://regrello.com","logo":"https://logos.yubhub.co/regrello.com.png"},"x-apply-url":"https://jobs.lever.co/regrello/3115193b-6b7b-4e03-bfcd-8bfab06e6e55","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"USD 175,000-200,000 per year","x-skills-required":["M.S. 
in Computer Science or a related field","5+ years of industry experience as a backend software engineer","Proven experience in the architecture and design of large data systems","Extensive experience in database systems development, data modeling, distributed systems and building robust application backends","Fluency with databases, APIs and modern backend technologies"],"x-skills-preferred":["Go","GraphQL","Postgres","Redis"],"datePosted":"2026-04-17T12:54:04.425Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"M.S. in Computer Science or a related field, 5+ years of industry experience as a backend software engineer, Proven experience in the architecture and design of large data systems, Extensive experience in database systems development, data modeling, distributed systems and building robust application backends, Fluency with databases, APIs and modern backend technologies, Go, GraphQL, Postgres, Redis","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":175000,"maxValue":200000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2324ce80-532"},"title":"Data Scientist - Network Value","description":"<p>We believe that the way people interact with their finances will drastically improve in the next few years. We&#39;re dedicated to empowering this transformation by building the tools and experiences that thousands of developers use to create their own products.</p>\n<p>Plaid powers the tools millions of people rely on to live a healthier financial life. We work with thousands of companies like Venmo, SoFi, several of the Fortune 500, and many of the largest banks to make it easy for people to connect their financial accounts to the apps and services they want to use.</p>\n<p>The Network Value Data Science team is helping Plaid build an industry leading fintech consumer network by increasing access to, authorization for, and usability of Plaid&#39;s User&#39;s financial footprints. We embed within product teams to support OKRs and help execute on product roadmaps. 
We translate ambiguous product questions into tractable analysis, serve as analytical thought partners throughout the org, identify opportunities to build better products, and champion a data-first decision making approach everywhere we go.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Perform ad-hoc and strategic analyses to uncover opportunities for improved business outcomes and translate complex questions into actionable analytics projects.</li>\n<li>Design and maintain scalable data models and dashboards that increase visibility into core systems and drive operational excellence.</li>\n<li>Build and iterate on machine learning prototypes to power insight-driven products and unlock new sources of customer and business value.</li>\n<li>Define and track OKRs that quantify progress toward key business goals, ensuring alignment and accountability across teams.</li>\n<li>Design and analyze experiments to guide product decisions and optimize feature launches.</li>\n<li>Champion a data-first culture by promoting analytical rigor and evidence-based decision-making across the organization.</li>\n</ul>\n<p><strong>Qualifications</strong></p>\n<ul>\n<li>2+ years of experience as a Data Scientist or in a related analytics or data-focused role</li>\n<li>Strong track record of turning complex data into strategic insights and measurable business impact</li>\n<li>Proven ability to use experimentation, advanced analytics, and data storytelling to uncover opportunities that drive key product and business outcomes</li>\n<li>Strong technical foundation in SQL and Python for large-scale analysis, data modeling, and ML prototyping</li>\n<li>Experience developing and maintaining data pipelines and metrics frameworks using tools such as Airflow and dbt</li>\n<li>Background working with complex backend systems, ensuring data integrity, scalability, and operational reliability across platforms</li>\n<li>Skilled at partnering cross-functionally with product, engineering, and business teams to influence prioritization and strategy through clear, data-driven communication</li>\n</ul>\n<p><strong>Additional Information</strong></p>\n<p>Our mission at Plaid is to unlock financial freedom for everyone. To support that mission, we seek to build a diverse team of driven individuals who care deeply about making the financial ecosystem more equitable. We recognize that strong qualifications can come from both prior work experiences and lived experiences. 
We encourage you to apply to a role even if your experience doesn&#39;t fully match the job description.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2324ce80-532","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Plaid","sameAs":"https://plaid.com/","logo":"https://logos.yubhub.co/plaid.com.png"},"x-apply-url":"https://jobs.lever.co/plaid/18503c02-17a0-4c47-98c8-155b0b6ccc2a","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$176,400-$243,600 per year","x-skills-required":["SQL","Python","Machine Learning","Data Modeling","Data Pipelines","Metrics Frameworks","Airflow","dbt"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:52:02.474Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, Python, Machine Learning, Data Modeling, Data Pipelines, Metrics Frameworks, Airflow, dbt","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":176400,"maxValue":243600,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_571471d0-f38"},"title":"Product Lead - Growth","description":"<p>We believe that the way people interact with their finances will drastically improve in the next few years. We&#39;re dedicated to empowering this transformation by building the tools and experiences that thousands of developers use to create their own products.</p>\n<p>As Product Lead - Growth, you will oversee one of Plaid&#39;s highest-breadth portfolios. You&#39;ll lead PMs across Growth, Web, and Customer Foundations, setting strategy for funnel optimization, down-market acquisition, enterprise enablement, and the customer lifecycle model. 
You will partner deeply with Engineering, Design, BI/DS, Marketing, Sales, and Partnerships to shape growth motions across all GTM channels.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Lead PMs across Growth, Web, and Customer Foundations.</li>\n<li>Define the end-to-end growth strategy across website, PLG, SMB, enterprise, and partnerships.</li>\n<li>Own core funnel performance: traffic → MQL → onboarding → activation.</li>\n<li>Drive scalable SMB and long-tail customer acquisition motions.</li>\n<li>Partner with Sales and Partnerships to support large enterprise deals, identify product gaps, and shape deal mechanics.</li>\n<li>Oversee onboarding risk engine and customer compliance workflows.</li>\n<li>Build and operationalize Plaid’s Customer Lifecycle Model (CLM) to ensure clean customer state and entitlement logic.</li>\n<li>Use data to identify funnel opportunities, segment users, validate hypotheses, and drive experimentation.</li>\n<li>Make resource allocation and prioritization decisions across a high-breadth portfolio.</li>\n<li>Communicate strategy, tradeoffs, and insights to Executive, GTM, and Engineering leadership</li>\n</ul>\n<p><strong>Qualifications</strong></p>\n<ul>\n<li>8-10+ years of product management experience, including 2+ years managing PMs.</li>\n<li>Demonstrated success leading PLG or B2B growth at scale.</li>\n<li>Strong analytical and funnel optimization skills; comfort with data modeling and experimentation.</li>\n<li>Deep experience partnering with GTM teams (Sales, Partnerships, Marketing).</li>\n<li>Experience supporting enterprise deals or partnership motions.</li>\n<li>Demonstrated ability to operate across multiple domains: growth, risk, compliance, onboarding, internal systems.</li>\n<li>Excellent communication and cross-functional leadership skills</li>\n</ul>\n<p><strong>Additional Information</strong></p>\n<p>Our mission at Plaid is to unlock financial freedom for everyone. 
To support that mission, we seek to build a diverse team of driven individuals who care deeply about making the financial ecosystem more equitable.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_571471d0-f38","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Plaid","sameAs":"https://plaid.com/","logo":"https://logos.yubhub.co/plaid.com.png"},"x-apply-url":"https://jobs.lever.co/plaid/b87c595b-b231-4a86-bffa-450e3f9dc335","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$216,000-329,400 per year","x-skills-required":["product management","growth strategy","funnel optimization","data modeling","experimentation","cross-functional leadership"],"x-skills-preferred":["fintech","onboarding","risk","compliance"],"datePosted":"2026-04-17T12:51:31.721Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"product management, growth strategy, funnel optimization, data modeling, experimentation, cross-functional leadership, fintech, onboarding, risk, compliance","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":216000,"maxValue":329400,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_89603aee-e7d"},"title":"Senior Data Analyst, Product Analytics (Fixed Term 10 Months)","description":"<p>This role is offered for a 10-month, fixed-term duration.</p>\n<p>As a Senior Data Analyst on the Product Analytics team, you will be a strategic thought partner with substantial autonomy, shaping how we use data to build products that improve the lives of our members. You will work side-by-side with product managers, engineers, clinicians, and business leaders to uncover insights, influence roadmaps, and measure impact.</p>\n<p>The Product Analytics team develops the reports, tools, and insights that drive product decisions across Omada. Using data from app engagement, clinical outcomes, behavioral health indicators, and claims, you will help product and engineering teams identify and solve the most important challenges facing our customers and members. 
In this role, you will own high-visibility analytics workstreams end-to-end,from defining the problem to communicating recommendations to executives.</p>\n<p>Key job responsibilities include:</p>\n<ul>\n<li>Own the data architecture and data mapping strategy that powers our product reporting, ensuring teams can self-serve reliable insights.</li>\n<li>Deliver key performance insights through regular and ad-hoc analyses that guide investment decisions and the evolution of Omada programs.</li>\n<li>Lead product experimentation, including test design, sample size determination, A/B testing, and clear communication of results and trade-offs.</li>\n<li>Design, build, and maintain intuitive Amplitude and Tableau dashboards (or similar BI tools) that become the source of truth for product teams.</li>\n<li>Partner closely with data engineering and other data teams to improve data quality, standardize definitions, and scale reporting across the organization.</li>\n<li>Define and refine operational processes, and support system enhancements that make analytics and reporting more efficient and impactful.</li>\n<li>Collaborate across analytics, data science, engineering, clinical, and product teams to drive aligned, data-informed decisions.</li>\n<li>Explore and integrate AI-driven approaches into product analytics use cases to level up how we generate insights and automate decision support.</li>\n</ul>\n<p>About you:</p>\n<ul>\n<li>The ideal candidate should have at least 2+ years of deep expertise in product or mobile analytics, including user behavior analysis, ideally 7+ years of experience.</li>\n<li>Expert SQL skills with strong experience in designing and developing new queries and reports.</li>\n<li>Experience with A/B testing and statistical measurement.</li>\n<li>Ability to analyze various data sets, and develop succinct key metrics.</li>\n<li>Strong relational database design/development skills.</li>\n<li>Excellent problem-solving, interpersonal, and communication skills.</li>\n<li>Track record of working on multiple projects under aggressive deadlines.</li>\n<li>Bachelor&#39;s degree in Computer Science, Engineering, or Mathematics.</li>\n<li>Skills: Data Analysis and Visualization, Amplitude, Tableau, SQL, ETL, Databases, Data model, and Excel.</li>\n</ul>\n<p>Your impact:</p>\n<ul>\n<li>Your work will directly shape how Omada designs, launches, and iterates on member experiences. 
You will identify new ways to drive engagement and clinical outcomes, support new products and features, and demonstrate the value of existing innovations.</li>\n</ul>\n<p>You will love this job if you are:</p>\n<ul>\n<li>A highly motivated, self-driven professional who thrives with autonomy and ownership.</li>\n<li>Excited to work closely with product managers, engineers, clinicians, and business stakeholders to tell compelling stories through data.</li>\n<li>Energized by solving complex data challenges and building scalable solutions that others rely on every day.</li>\n</ul>\n<p>Bonus points for:</p>\n<ul>\n<li>Knowledge of additional coding languages (Python or R).</li>\n<li>Advanced experience in data visualization.</li>\n<li>Background in healthcare and experience with PHI.</li>\n<li>Knowledge of population health metrics.</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Competitive salary.</li>\n<li>Remote-first work-from-home culture.</li>\n<li>Flexible Time Off to help you rest, recharge, and connect with loved ones.</li>\n<li>Health, dental, and vision insurance (and above-market employer contributions).</li>\n<li>...and more!</li>\n</ul>\n<p>It takes a village to change healthcare. As we build together toward our mission, we strive to embody the following values in our day-to-day work. We hope these hold meaning for you as well as you consider Omada!</p>\n<ul>\n<li>Start with Trust. We listen closely and we operate with kindness. We provide respectful and candid feedback to each other.</li>\n<li>Seek Context. We ask to understand and we build connections. We do our research up front to move faster down the road.</li>\n<li>Act Boldly. We innovate daily to solve problems, improve processes, and find new opportunities for our members and customers.</li>\n<li>Deliver Results. We reward impact above output. We set a high bar, we&#39;re not afraid to fail, and we take pride in our work.</li>\n<li>Succeed Together. We prioritize Omada&#39;s progress above team or individual. We have fun as we get stuff done, and we celebrate together.</li>\n<li>Remember Why We&#39;re Here. We push through the challenges of changing healthcare because we know the destination is worth it.</li>\n</ul>\n<p>About Omada Health:</p>\n<p>Omada Health is a between-visit healthcare provider that addresses lifestyle and behavior change elements for individuals managing chronic conditions. Omada&#39;s multi-condition platform treats diabetes, hypertension, prediabetes, musculoskeletal, and GLP-1 management. With insights from connected devices and AI-supported tools, Omada care teams deliver care that is rooted in evidence and unique to every member, unlocking results at scale. With more than a decade of experience and data, and 29 peer-reviewed publications showcasing clinical and economic proof points, Omada&#39;s approach is designed to improve health outcomes and contain costs. Our customers include health plans, pharmacy benefit managers, health systems, and employers ranging from small businesses to Fortune 500s. At Omada, we aim to inspire and empower people to make lasting health changes on their own terms.</p>\n<p>For more information, visit: <a href=\"https://www.omadahealth.com/\">https://www.omadahealth.com/</a></p>\n<p>Omada is thrilled to share that we&#39;ve been certified as a Great Place to Work! Please click here for more information.</p>\n<p>We carefully hire the best talent we can find, which means actively seeking diversity of beliefs, backgrounds, education, and ways of thinking. 
We strive to build an inclusive culture where differences are celebrated and leveraged to inform better design and business decisions. Omada is proud to be an equal opportunity workplace and affirmative action employer. We are committed to equal opportunity regardless of race, color, religion, sex, gender identity, national origin, ancestry, citizenship, age, physical or mental disability, legally protected medical condition, family care status, military or veteran status, marital status, domestic partner status, sexual orientation, or any other basis protected by local, state, or federal laws.</p>\n<p>Below is a summary of salary ranges for this role in the following geographies:</p>\n<ul>\n<li>California, New York</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_89603aee-e7d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Omada Health","sameAs":"https://www.omadahealth.com/","logo":"https://logos.yubhub.co/omadahealth.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/omadahealth/jobs/7800367","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Data Analysis and Visualization","Amplitude","Tableau","SQL","ETL","Databases","Data model","Excel"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:51:17.069Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, USA"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"Data Analysis and Visualization, Amplitude, Tableau, SQL, ETL, Databases, Data model, Excel"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7b750523-8ff"},"title":"Staff Software Engineer, Data Engineering","description":"<p>We are seeking a Staff Software Engineer to lead the technical strategy and implementation of our enterprise data architecture, governance foundations, and analytics enablement tooling.</p>\n<p>In this role, you will be the primary engineering counterpart to the Senior Product Manager for Data Enablement &amp; Governance, jointly shaping the roadmap for enterprise analytics, shared definitions, and the tools that help Omada answer questions faster and more reliably.</p>\n<p>You will design and evolve core data products, define patterns and standards used across the company, and drive the technical execution of initiatives that ensure our metrics, reports, and data products are scalable, governed, and trustworthy.</p>\n<p>This is a high-impact, cross-functional Staff role working across Data Engineering, Data Science, Analytics, Product, IT, and business leaders.</p>\n<p><strong>Key Responsibilities:</strong></p>\n<p><strong>Enterprise Data Architecture</strong></p>\n<ul>\n<li>Own the vision and technical roadmap for Omada&#39;s enterprise data architecture, spanning ingestion, storage, modeling, and serving layers for analytics and applied statistics use cases.</li>\n<li>Design, implement, and evolve scalable, secure, and cost-efficient data solutions (datalakes, warehouses, marts, semantic layers) that support governed, cross-functional analytics and self-service.</li>\n<li>Define and socialize architectural patterns, data contracts, and integration standards used by data and product teams across the 
organization.</li>\n<li>Anticipate future needs (e.g., new product lines, new modalities, AI/ML workloads) and drive proactive architectural changes rather than reacting to incidents or point-in-time requests.</li>\n</ul>\n<p><strong>Data Modeling, Quality, and Governance Foundations</strong></p>\n<ul>\n<li>Lead the design of logical and physical data models to support enterprise metrics, dashboards, and ad hoc analytics, with a focus on reusability and clear ownership.</li>\n<li>Implement robust data quality, validation, and monitoring frameworks that underpin trusted “single source of truth” definitions for core concepts (e.g., active member, MAU, GLP-1 member).</li>\n<li>Partner with the Senior Product Manager, Data Enablement &amp; Governance to translate governance decisions (definitions, ownership, change-management processes) into concrete technical implementations in the data platform.</li>\n<li>Set standards and review mechanisms to ensure new pipelines, marts, and reports align with enterprise definitions and governance policies.</li>\n<li>Continuously improve performance, scalability, and cost-efficiency of data workflows and storage; lead deep dives and remediation for complex production issues.</li>\n</ul>\n<p><strong>Enterprise Data Products Lifecycle</strong></p>\n<ul>\n<li>In close partnership with the Senior PM, define and deliver core, reusable data products (e.g., engagement, clinical, financial, client, care delivery datasets) that power dashboards, reporting, and self-service analytics.</li>\n<li>Co-architect and implement technical foundations for AI-assisted analytics tools, governed semantic layers, and reporting applications that make analysts and business users more efficient.</li>\n<li>Partner with Product and Engineering teams owning tools like Amplitude, Tableau, and internal reporting tools to ensure consistent instrumentation, mapping to enterprise definitions, and scalable access patterns.</li>\n<li>Translate business and product requirements into resilient schemas, data services, and interfaces that are usable, maintainable, and auditable.</li>\n<li>Ensure production data delivery meets defined SLAs and supports downstream BI, reporting apps, and applied statistics workloads.</li>\n<li>Play a key role in cross-functional forums (e.g., Data Governance Committee, analytics communities) as the technical voice for feasibility, risk, and long-term platform health.</li>\n</ul>\n<p><strong>Technical Leadership, Mentorship, and Culture</strong></p>\n<ul>\n<li>Lead large, multi-team technical initiatives, from design to implementation and rollout, setting a high bar for design docs, reviews, and execution quality.</li>\n<li>Mentor senior and mid-level engineers, elevating the team’s skills in data modeling, pipeline design, governance, and platform thinking.</li>\n<li>Help shape playbooks for how product squads and spokes engage with central data teams on new metrics, data products, and applied stats projects.</li>\n<li>Partner closely with Analytics, Data Science, Product, and business leaders to ensure data architecture and governance decisions are aligned with company OKRs and measurable business value.</li>\n<li>Proactively identify complexity, duplication, and fragility in existing systems; drive simplification and standardization with sustainable solutions.</li>\n<li>Model Omada’s values in day-to-day work, fostering a culture of trust, context-seeking, bold thinking, and high-impact delivery.</li>\n</ul>\n<p><strong>About You:</strong></p>\n<ul>\n<li>8+ years 
of experience building, maintaining, and orchestrating scalable data platforms and high-quality production pipelines, including significant experience in analytics or warehousing environments.</li>\n<li>Demonstrated Staff-level impact: leading cross-team technical initiatives, making architectural decisions that shaped a multi-year roadmap, and influencing stakeholders beyond your immediate team.</li>\n<li>Deep experience with cloud data ecosystems (e.g., AWS) and modern data warehouses (e.g., Redshift, Snowflake, BigQuery), including MPP query optimization.</li>\n<li>Strong background in data modeling for OLTP and OLAP, and designing reusable data products for BI, reporting, and advanced analytics.</li>\n<li>Hands-on experience implementing data quality, observability, and governance frameworks, ideally in a regulated or PHI/PII-sensitive environment.</li>\n<li>Experience partnering with Product Management and Analytics to define and deliver platform capabilities, not just point solutions.</li>\n</ul>\n<p><strong>Technical Skills:</strong></p>\n<ul>\n<li>Strong proficiency in SQL (analytical and performance-tuned) and experience with relational and MPP databases.</li>\n<li>Proficiency in at least one modern programming language used in data engineering (e.g., Python, Java, Scala) and comfort applying software engineering best practices (testing, CI/CD, code review).</li>\n<li>Experience with workflow orchestration and data integration tools (e.g., Airflow) and event-driven or streaming patterns where appropriate.</li>\n<li>Familiarity with BI and analytics tools (e.g., Tableau, Amplitude, or similar) and how they integrate with governed data layers.</li>\n<li>Experience with data governance concepts (ownership, lineage, definitions, access controls) and their technical implementation in a modern data stack.</li>\n<li>Familiarity with AI tools for development.</li>\n</ul>\n<p><strong>Communication &amp; Working Style:</strong></p>\n<ul>\n<li>Excellent communication and collaboration skills, with the ability to convey complex technical concepts to non-technical stakeholders.</li>\n<li>Highly self-directed and comfortable operating in ambiguous, cross-functional problem spaces, creating clarity and direction where none exists.</li>\n<li>Strong sense of ownership and bias for impact; you care about outcomes for members, customers, and internal users, not just elegant systems.</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<ul>\n<li>Competitive salary with generous annual cash bonus</li>\n<li>Equity grants</li>\n<li>Remote first work from home culture</li>\n<li>Flexible Time Off to help you recharge</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7b750523-8ff","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Omada Health","sameAs":"https://www.omadahealth.com/","logo":"https://logos.yubhub.co/omadahealth.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/omadahealth/jobs/7753330","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Cloud data ecosystems","Modern data warehouses","MPP query optimization","Data modeling","Data quality","Data governance","Workflow orchestration","Data integration","Event-driven or streaming patterns","BI and analytics tools","AI tools for 
development"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:50:06.765Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, USA"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"SQL, Cloud data ecosystems, Modern data warehouses, MPP query optimization, Data modeling, Data quality, Data governance, Workflow orchestration, Data integration, Event-driven or streaming patterns, BI and analytics tools, AI tools for development"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_89e1235e-be6"},"title":"Senior Salesforce Administrator","description":"<p>The Senior Salesforce Administrator is a core technical contributor on the Business Systems team. This role is responsible for configuring, automating, and continuously optimising Omada&#39;s Salesforce ecosystem to support business operations.</p>\n<p>The Senior Salesforce Administrator partners with business systems analysts, developers, and cross-functional stakeholders to design and maintain scalable, high-quality solutions that improve business processes and operational efficiency.</p>\n<p>This role serves as a platform subject matter expert, owning Salesforce configuration standards, advanced automation, and technical escalations.</p>\n<p>Success in this role requires strong technical depth in Salesforce administration and automation, knowledge of platform best practices, and effective collaboration in a sprint-based, cross-functional environment.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li><p>Configuring and maintaining Salesforce objects, fields, page layouts, record types, validation rules, and security settings to support evolving business processes and uphold platform configuration standards.</p>\n</li>\n<li><p>Designing, building, and maintaining Salesforce Flows (record-triggered, screen, subflows, and scheduled) to automate and streamline workflows, establishing reusable, scalable automation patterns.</p>\n</li>\n<li><p>Evaluating business and functional requirements and determining the appropriate automation approach in partnership with analysts and developers.</p>\n</li>\n<li><p>Partnering closely with developers on solutions that combine declarative configuration and programmatic components, providing guidance on Salesforce best practices and configuration impacts.</p>\n</li>\n<li><p>Supporting sprint-based delivery, including build, test, deployment, and post-release validation activities.</p>\n</li>\n<li><p>Performing data loads, data updates, and ongoing data quality maintenance to ensure accuracy and integrity.</p>\n</li>\n<li><p>Serving as a primary escalation point to troubleshoot and resolve complex Salesforce issues and user requests, coordinating with analysts and developers as needed.</p>\n</li>\n<li><p>Defining and maintaining documentation, standards, and continuous improvement practices for the Salesforce platform and related processes.</p>\n</li>\n</ul>\n<p>Bonus points for experience partnering closely with Salesforce developers on solutions that combine declarative and programmatic components, mentoring or providing guidance to other administrators, analysts, or power users on Salesforce configuration and best practices, working in Agile or other sprint-based delivery models, configuring or applying Salesforce AI capabilities (Einstein/Agentforce) to improve business workflows and user experience, and 
experience in healthcare, digital health, or other regulated environments.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_89e1235e-be6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Omada Health","sameAs":"https://www.omadahealth.com/","logo":"https://logos.yubhub.co/omadahealth.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/omadahealth/jobs/7648699","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Salesforce Administration","Salesforce Automation","Salesforce Configuration","Salesforce Data Model","Salesforce Security Model","Salesforce Flows","Agile Methodology","Cross-Functional Collaboration"],"x-skills-preferred":["Salesforce AI Capabilities","Healthcare Experience","Digital Health Experience","Regulated Environment Experience"],"datePosted":"2026-04-17T12:49:57.712Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, USA"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"Salesforce Administration, Salesforce Automation, Salesforce Configuration, Salesforce Data Model, Salesforce Security Model, Salesforce Flows, Agile Methodology, Cross-Functional Collaboration, Salesforce AI Capabilities, Healthcare Experience, Digital Health Experience, Regulated Environment Experience"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_dcb44b1b-ec9"},"title":"Senior Data Analyst","description":"<p>At Neighbor, our vision is to bring communities together by solving our neighbors&#39; biggest challenges.</p>\n<p>We&#39;re building the largest hyperlocal marketplace the world has seen. Our marketplace is already flourishing in all 50 states and we&#39;re just getting started!</p>\n<p>As a Senior Data Analyst, you are an independent driver and architect, entrusted with solving Neighbor&#39;s most ambitious, complex, and ambiguous problems. You will leverage advanced data analytics to not only execute the data strategy, but also anticipate the challenges that lie ahead. 
You will bridge the gap between executive vision and technical implementation, pioneering robust data models and pipelines while steering executive-level data strategy and fundamentally shaping the company&#39;s data-driven business decisions with your insights.</p>\n<p><strong>Primary Responsibilities</strong></p>\n<ul>\n<li>Help build a world-class Data &amp; Analytics team focused on data integrity, business intelligence, data-driven decisions, testing, accountability and research</li>\n<li>Lead the design and implementation of our data modeling layers (Data Lakes/Warehouses), ensuring a &#39;single source of truth&#39; for a complex, two-sided marketplace</li>\n<li>Beyond SQL, you will build and maintain robust ETL/ELT pipelines and deploy predictive models to forecast and automate insights</li>\n<li>Mentor junior team members and empower non-technical stakeholders to make autonomous, data-informed decisions through world-class BI tooling</li>\n<li>Move beyond &#39;what happened&#39; to &#39;what will happen.&#39; Develop statistical frameworks to test hypotheses and measure the impact of new product features</li>\n<li>Act as the primary data partner for Product, Marketing, Sales, Engineering, Finance, and Customer Success leadership</li>\n<li>Become an expert on all aspects of Neighbor&#39;s product, user, marketing and other data</li>\n</ul>\n<p><strong>Qualifications</strong></p>\n<ul>\n<li>Bachelor&#39;s degree in quantitative and/or technical fields (Math, Physics, Statistics, Economics, Computer Science, Engineering, etc.) OR 7 years working experience in data analytics</li>\n<li>5+ years of prior experience as a Data Analyst</li>\n<li>3+ years of experience solving complex, ambiguous problems independently and presenting results to stakeholders</li>\n<li>Experience with ETL such as data manipulation, organization and cleaning</li>\n<li>Experience modeling data lakes or data warehouses</li>\n<li>Experience with predictive modeling</li>\n<li>Advanced proficiency in reporting and visualization software</li>\n<li>Experience using mathematical, scientific and statistical techniques to analyze data</li>\n<li>Proven track record of using quantitative analysis to solve problems, and drive key business decisions</li>\n</ul>\n<p><strong>Additional Information</strong></p>\n<p>About Neighbor: Neighbor is the largest and most comprehensive marketplace for self storage and parking, with listings in almost every U.S. city. 
From storage facilities to neighborhood garages, driveways, and RV spots, Neighbor brings every option together in one simple search.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_dcb44b1b-ec9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Neighbor","sameAs":"https://www.neighbor.com/","logo":"https://logos.yubhub.co/neighbor.com.png"},"x-apply-url":"https://jobs.lever.co/neighbor/034252c8-d93c-40a0-95dc-2830273acba0","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["ETL","data modeling","predictive modeling","reporting and visualization software","mathematical and statistical techniques","SQL"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:49:13.658Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"U.S."}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"ETL, data modeling, predictive modeling, reporting and visualization software, mathematical and statistical techniques, SQL"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_f5ef2a17-622"},"title":"Senior Product Manager - Ledger","description":"<p>We&#39;re looking for a Senior Product Manager to own Mercury&#39;s Ledger platform, a double-entry bookkeeping engine that records every financial event as balanced, auditable entries. The Ledger team sits at the intersection of financial infrastructure, regulatory accountability, and org-wide platform adoption.</p>\n<p>As part of the journey, we would expect you to:</p>\n<ul>\n<li>Own the Ledger roadmap: Drive strategy and execution for Mercury&#39;s Double Entry Ledger (DEL),the financial system of record.</li>\n<li>Drive org-wide ledger adoption: Partner with payment rails teams to enable their flows on the ledger platform, ensuring event-to-ledger integration is clean, correct, and scalable.</li>\n<li>Build next generation financial infrastructure: Own the implementation of our financial processes across charts of accounts, transfer code definitions, and GL mapping frameworks that connect to our core systems of record.</li>\n<li>Partner with Finance and Accounting: Translate accounting requirements into product architecture, and translate product decisions back to stakeholders who think in debits, credits, journal entries, and regulatory reports.</li>\n<li>Set the financial data standard: Define what clean, reconcilable financial data looks like at Mercury. 
Work with the Reconciliation team to ensure every ledger entry is accurate, traceable, and exception-free.</li>\n<li>Navigate a complex, high-stakes stakeholder environment: Coordinate across banking platform teams, payment rails teams, Finance, Compliance, Risk, all of whom depend on your platform and have a voice in how it evolves.</li>\n</ul>\n<p>Some things that might make you successful in a role like this:</p>\n<ul>\n<li>7+ years of product management experience in fintech, financial services, or platform/infrastructure product roles.</li>\n<li>Real technical fluency: you can engage credibly with engineers on event-sourced architectures, database consistency models, double-entry accounting data models, and financial data pipelines, not as a generalist, but as a peer.</li>\n<li>Meaningful background in accounting, finance, or banking operations: you understand chart of accounts, GL mapping, journal entries, reconciliation workflows, and why they matter to regulators.</li>\n<li>Experience building and scaling internal platform products with broad adoption requirements: you know how to work with adopter teams, reduce onboarding friction, and drive org-wide standardization across a distributed set of engineers.</li>\n<li>A highly structured, data-driven approach to product decisions: you think in systems, model second and third-order effects, and communicate tradeoffs in a way that lands with both engineering and finance audiences.</li>\n<li>Exceptional cross-functional communication: you earn credibility in rooms with accountants, engineers, compliance officers, and regulators, and you hold your own in all of them.</li>\n</ul>\n<p>The total rewards package at Mercury includes base salary, equity, and benefits.</p>\n<p>Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry. 
New hire offers are made based on a candidate’s experience, expertise, geographic location, and internal pay equity relative to peers.</p>\n<p>Our target new hire base salary ranges for this role are the following:</p>\n<ul>\n<li>US employees (any location): $200,700 - $250,900</li>\n<li>Canadian employees (any location): CAD 189,700 - 237,100</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_f5ef2a17-622","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mercury","sameAs":"https://www.mercury.com/","logo":"https://logos.yubhub.co/mercury.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mercury/jobs/5832762004","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$200,700 - $250,900 (US employees) or CAD 189,700 - 237,100 (Canadian employees)","x-skills-required":["product management","fintech","financial services","platform/infrastructure product roles","event-sourced architectures","database consistency models","double-entry accounting data models","financial data pipelines","accounting","finance","banking operations","chart of accounts","GL mapping","journal entries","reconciliation workflows"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:45:21.199Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Fintech","skills":"product management, fintech, financial services, platform/infrastructure product roles, event-sourced architectures, database consistency models, double-entry accounting data models, financial data pipelines, accounting, finance, banking operations, chart of accounts, GL mapping, journal entries, reconciliation workflows","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":189700,"maxValue":250900,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e503559e-cf7"},"title":"Senior Machine Learning Engineer","description":"<p><strong>Job Title: Senior Machine Learning Engineer</strong></p>\n<p><strong>Job Description:</strong></p>\n<p>Before 1965, it was extremely difficult and time-consuming to analyze complicated signals, like radio or images. You could solve it, but you had to throw a ton of compute at it. That all changed with the invention of the Fast Fourier transform, which could efficiently break that signal down into the frequencies that are a part of it.</p>\n<p>The Risk Onboarding team is working on efficiently reviewing customers’ applications without compromising on quality. We are the front line of defense for preventing money laundering and financial crimes, building systems to verify that someone is who they say they are and that we are allowed to do business with them.</p>\n<p><strong>About Us:</strong></p>\n<p>At Mercury, we craft an exceptional banking experience for startups. 
Our team is focused on ensuring our products create a safe environment that meets the needs of our customers, administrators, and regulators.</p>\n<p><strong>Job Responsibilities:</strong></p>\n<p>As part of this role, you will:</p>\n<ul>\n<li>Partner with data science &amp; engineering teams to design and deploy ML &amp; Gen AI microservices, primarily focusing on automating reviews</li>\n<li>Work with a full-stack engineering team to embed these services into the overall review experience, including human in the loop, escalations, and feeding human decisions back into the service</li>\n<li>Implement testing, observability, alerting, and disaster recovery for all services</li>\n<li>Implement tracing, performance, and regression testing</li>\n<li>Feel a strong sense of product ownership and actively seek responsibility – we often self-organize on small/medium projects, and we want someone who’s excited to help shape and build Mercury’s future</li>\n</ul>\n<p><strong>Ideal Candidate:</strong></p>\n<p>The ideal candidate for the role has:</p>\n<ul>\n<li>7+ years of experience in roles like machine learning engineering, data engineering, backend software engineering, and/or devops</li>\n<li>Expertise with:</li>\n</ul>\n<ul>\n<li>A full modern data stack: Snowflake, dbt, Fivetran, Airbyte, Dagster, Airflow</li>\n<li>SQL, dbt, Python</li>\n<li>OLAP / OLTP data modelling and architecture</li>\n<li>Key-value stores: Redis, dynamoDB, or equivalent</li>\n<li>Streaming / real-time data pipelines: Kinesis, Kafka, Redpanda</li>\n<li>API frameworks: FastAPI, Flask, etc.</li>\n<li>Production ML Service experience</li>\n<li>Working across full-stack development environment, with experience transferable to Haskell, React, and TypeScript</li>\n</ul>\n<p><strong>Total Rewards Package:</strong></p>\n<p>The total rewards package at Mercury includes base salary, equity (stock options/RSUs), and benefits. Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry. New hire offers are made based on a candidate’s experience, expertise, geographic location, and internal pay equity relative to peers.</p>\n<p><strong>Salary Range:</strong></p>\n<p>Our target new hire base salary ranges for this role are the following:</p>\n<ul>\n<li>US employees (any location): $200,700 - $250,900</li>\n<li>Canadian employees (any location): CAD 189,700 - 237,100</li>\n</ul>\n<p><strong>Diversity &amp; Belonging:</strong></p>\n<p>Mercury values diversity &amp; belonging and is proud to be an Equal Employment Opportunity employer. 
All individuals seeking employment at Mercury are considered without regard to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, sexual orientation, or any other legally protected characteristic.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e503559e-cf7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mercury","sameAs":"https://www.mercury.com/","logo":"https://logos.yubhub.co/mercury.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mercury/jobs/5639559004","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$200,700 - $250,900 (US) | CAD 189,700 - 237,100 (Canada)","x-skills-required":["Snowflake","dbt","Fivetran","Airbyte","Dagster","Airflow","SQL","Python","OLAP / OLTP data modelling and architecture","Redis","dynamoDB","Kinesis","Kafka","Redpanda","FastAPI","Flask","Production ML Service experience","Haskell","React","TypeScript"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:45:16.566Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Snowflake, dbt, Fivetran, Airbyte, Dagster, Airflow, SQL, Python, OLAP / OLTP data modelling and architecture, Redis, dynamoDB, Kinesis, Kafka, Redpanda, FastAPI, Flask, Production ML Service experience, Haskell, React, TypeScript","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":189700,"maxValue":250900,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6e92655b-cbb"},"title":"Senior Data Scientist - Banking","description":"<p>We&#39;re looking for a full-stack Data Scientist to support our Cards &amp; Credit roadmap, partnering closely with Product, Engineering, Design, Underwriting, and Operations to shape how our card and credit products evolve and scale.</p>\n<p>In this role, you&#39;ll apply strong analytical judgment and product intuition to help us understand customer behaviour, evaluate trade-offs, and make smart investment decisions across the cards and lending lifecycles , from eligibility and activation to spend, retention, incentives, and credit performance. 
You&#39;ll help build a data-informed culture across Mercury so teams can move quickly, measure what matters, and invest intelligently.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Bringing impeccable communication and complete ownership, independently identifying opportunities, developing strong points of view, and influencing executives, Cards &amp; Credit leaders, and cross-functional partners through clear, concise, and persuasive storytelling.</li>\n<li>Developing a nuanced understanding of cardholder behaviour and economics, helping teams reason about trade-offs between growth, engagement, risk, and unit economics.</li>\n<li>Defining, owning, and analysing metrics that inform both tactical decisions and long-term strategy across the cards and credit lifecycle (e.g., eligibility, activation, spend, utilisation, rewards, retention, loss signals).</li>\n<li>Designing and evaluating experiments using rigorous statistical approaches, including A/B testing, cohort analysis, causal inference techniques, and trend analysis.</li>\n<li>Building and improving data pipelines and tools to streamline data collection, processing, and analysis workflows, ensuring the integrity, reliability, and security of data assets.</li>\n<li>Building and deploying predictive models to forecast key outcomes, inform product treatments, and deepen understanding of causal drivers.</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>7+ years of experience working with large datasets to drive product or business impact in data science or analytics roles.</li>\n<li>Fluency in SQL and comfort with Python.</li>\n<li>Strong judgment in defining and analysing product metrics, running experiments, and translating ambiguous questions into structured analyses.</li>\n<li>Exceptional proactivity and independence, identifying opportunities, forming strong points of view, and making your case to stakeholders.</li>\n<li>Experience with ETL processes and modern data modelling (e.g., dbt, dimensional models, Airflow), with a solid understanding of how data is produced and consumed.</li>\n<li>Experience in analytical approaches ranging from behavioural modelling to experimentation to optimisation, and, importantly, knowing when simpler approaches are the right answer.</li>\n<li>Ability to apply AI tools to accelerate analytical and business workflows, improving scalability and decision quality while reducing manual or repetitive work across teams.</li>\n</ul>\n<p>Nice to have:</p>\n<ul>\n<li>Experience working on cards or credit products, with familiarity in card economics and lifecycle concepts (e.g., spend behaviour, interchange, rewards and incentives, utilisation, credit limits, retention).</li>\n<li>Experience developing quantitative pricing models or engines (e.g., dynamic pricing, incentive optimisation, or marketplace pricing systems).</li>\n<li>Experience applying optimisation techniques to resource allocation or decision systems (e.g., customer operations, capacity planning, or policy optimisation).</li>\n<li>Experience building or supporting credit models, including probability of default modelling, cashflow modelling, or dynamic credit limit setting.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6e92655b-cbb","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mercury","sameAs":"https://www.mercury.com/","logo":"https://logos.yubhub.co/mercury.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mercury/jobs/5799320004","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$200,700 - $250,900 USD","x-skills-required":["SQL","Python","ETL processes","modern data modelling","A/B testing","cohort analysis","causal inference techniques","trend analysis","data pipelines","predictive models"],"x-skills-preferred":["cardholder behaviour and economics","quantitative pricing models","optimisation techniques","credit models","probability of default modelling","cashflow modelling","dynamic credit limit setting"],"datePosted":"2026-04-17T12:45:16.180Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"SQL, Python, ETL processes, modern data modelling, A/B testing, cohort analysis, causal inference techniques, trend analysis, data pipelines, predictive models, cardholder behaviour and economics, quantitative pricing models, optimisation techniques, credit models, probability of default modelling, cashflow modelling, dynamic credit limit setting","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":200700,"maxValue":250900,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_782f7e7f-e0e"},"title":"Revenue Technology - Data Strategy & Operations Lead","description":"<p>We&#39;re looking for a Data Strategy &amp; Operations leader to own the data foundations that power revenue execution. This role ensures that revenue data is reliable, interpretable, scalable, and usable as the business evolves and that teams can act on what they see with confidence.</p>\n<p>In this role, you will report to the Head of Platforms &amp; Infrastructure and play a central role in shaping how Mercury models, governs, and operationalizes GTM data. 
You’ll partner closely with Data Engineering, Data Science, Solution Architecture, Platform Engineering, etc.</p>\n<p>Some key responsibilities include:</p>\n<ul>\n<li>Owning the definition, structure, and reliability of data originating from revenue platforms (e.g., Salesforce, GTM tools, automation systems)</li>\n<li>Serving as the primary decision owner for GTM-sourced tables and views used for revenue execution, forecasting inputs, lifecycle tracking, and signal-based workflows</li>\n<li>Designing and evolving core GTM data models across Salesforce, ETL, and analytics layers</li>\n<li>Partnering with Data Engineering to align GTM schemas with enterprise data models and define clear data contracts between source systems and downstream consumers</li>\n<li>Partnering with Data Science / Analytics to ensure revenue data is interpretable, statistically sound, and reflects how the business actually operates</li>\n<li>Owning clarity around data ownership boundaries, shared dependencies, and escalation paths when upstream or downstream changes impact revenue integrity</li>\n<li>Defining and upholding data quality, freshness, consistency, and documentation standards for revenue platforms</li>\n<li>Monitoring and improving pipeline reliability, performance, and scalability, proactively identifying fragile or redundant transformations</li>\n<li>Identifying opportunities to automate manual or error-prone data workflows and reduce operational overhead</li>\n<li>Acting as a data thought partner to Platforms &amp; Infrastructure, Revenue Operations, Analytics, and Security , advising on feasibility, tradeoffs, and sequencing for data-heavy initiatives</li>\n</ul>\n<p>You should have:</p>\n<ul>\n<li>7+ years of experience in data engineering or data systems roles within SaaS or technology companies</li>\n<li>Deep experience designing and operating production data pipelines</li>\n<li>Highly proficient in SQL and experienced in data modeling</li>\n<li>Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery, Redshift)</li>\n<li>Experience with ETL / ELT tooling (e.g., dbt, Airflow, Census, or similar)</li>\n<li>Understanding of Salesforce data models and common GTM system architectures</li>\n<li>Ability to translate business concepts into durable, well-structured data models</li>\n<li>Clear communication skills with both technical and non-technical partners</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Experience supporting revenue, sales, or customer lifecycle data</li>\n<li>Familiarity with event-based data platforms (e.g., Data Cloud or equivalents)</li>\n<li>Experience working alongside platform engineering and security teams</li>\n<li>Exposure to data governance, access controls, and compliance considerations</li>\n<li>Experience mentoring or guiding other data practitioners</li>\n</ul>\n<p>The total rewards package at Mercury includes base salary, equity, and benefits. 
Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_782f7e7f-e0e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mercury","sameAs":"https://www.mercury.com/","logo":"https://logos.yubhub.co/mercury.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mercury/jobs/5806201004","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$142,600 - $198,000","x-skills-required":["SQL","data modeling","modern data stacks","ETL/ELT tooling","Salesforce data models","GTM system architectures"],"x-skills-preferred":["event-based data platforms","data governance","access controls","compliance considerations","mentoring/guiding other data practitioners"],"datePosted":"2026-04-17T12:44:37.206Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, data modeling, modern data stacks, ETL/ELT tooling, Salesforce data models, GTM system architectures, event-based data platforms, data governance, access controls, compliance considerations, mentoring/guiding other data practitioners","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":142600,"maxValue":198000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3048ccd4-7de"},"title":"Data Analyst","description":"<p>We are seeking a Data Analyst to join our growing data team. As a Data Analyst at LayerZero, you will be at the forefront of shaping a rich data foundation for a company making a real impact in the web3 space. You will work closely with teams and leaders to uncover insights, drive decision-making, and fuel our next-generation products and services.</p>\n<p>The successful candidate will dive headfirst into the world of crypto data, exploring on-chain wallets and contracts, block and transaction data, insights from in-house systems, and third-party intelligence. 
Your mission will be to combine these diverse datasets into rich, actionable data products for a broad group of stakeholders.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Leveraging and expanding our ever-growing Kimball dimensional model.</li>\n<li>Writing SQL to create and expand insights in our in-house reporting solutions.</li>\n<li>Collaborating with stakeholders across the organization to conduct ad-hoc explorations and analytics.</li>\n<li>Being a key owner of data quality, building out insights that serve the data team itself.</li>\n<li>Composing pipelines by writing SQL code to clean, combine, refine, and aggregate data into the insights the organization needs.</li>\n<li>Collaborating on new datasets to ingest into our Snowflake data warehouse, working closely with data engineers on your team.</li>\n<li>Not afraid of pushing code that supports tens of billions of dollars in daily transaction volume.</li>\n</ul>\n<p>We are looking for someone with previous data analyst experience, likely with a bachelor&#39;s degree in Computer Science, Statistics, Mathematics, Physics or related field, but we also consider and highly value equivalent practical experience.</p>\n<p>Required skills include strong SQL knowledge and experience, a proven track record in data modeling, statistics, and analytics, experience working with a broad range of stakeholders, and strong convictions weakly held.</p>\n<p>Nice to have skills include experience with general programming, experience with Snowflake, experience building DAG-based data pipelines, experience with streaming real-time data pipelines, previous experience with blockchain technologies, smart contracts, and decentralized finance, experience with Kimball dimensional modeling, and experience working on mid-to-large scale data stacks.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3048ccd4-7de","directApply":true,"hiringOrganization":{"@type":"Organization","name":"LayerZero","sameAs":"https://layerzero.com/","logo":"https://logos.yubhub.co/layerzero.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/layerzerolabs/jobs/5787956004","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","statistics","analytics","Snowflake","Kimball dimensional modeling"],"x-skills-preferred":["general programming","DAG-based data pipelines","streaming real-time data pipelines","blockchain technologies","smart contracts","decentralized finance"],"datePosted":"2026-04-17T12:41:37.110Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Vancouver, BC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, statistics, analytics, Snowflake, Kimball dimensional modeling, general programming, DAG-based data pipelines, streaming real-time data pipelines, blockchain technologies, smart contracts, decentralized finance"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8a8c0eb9-6e6"},"title":"Data Scientist, Product","description":"<p><strong>Job Title: Data Scientist, Product</strong></p>\n<p>This is the founding hire for product analytics at Hebbia. As a data scientist, you will define what our core product metrics are: what counts as an active user, what engagement actually means, what signals correlate with retention.</p>\n<p>This is not a dashboarding role. 
The goal is to shape product decisions with data, not just report on them. You will identify which workflows drive repeat usage, where users drop off, what features move engagement, and what differentiates power users from casual users across our enterprise customer base.</p>\n<p>The role sits at the intersection of analytics engineering, product analytics, and data science. You will build the infrastructure and do the analysis. Define the metrics, build the pipelines, create the dashboards, and use what you built to inform the roadmap.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Define and implement Hebbia&#39;s core product metrics from scratch: active users, engagement, retention, feature adoption, account health. Build the canonical definitions the entire company uses.</li>\n<li>Design and build the product analytics infrastructure: fact tables, clean data models, and the analytics layer that sits on top of our product data.</li>\n<li>Build and maintain executive and product dashboards that leadership and product teams use to make decisions.</li>\n<li>Write DAGs, transforms, and data pipelines that support analytics. Work with engineering to instrument the product so usage data is captured correctly.</li>\n<li>Analyze customer behavior across our B2B customer base: account-level usage patterns, workflow adoption, expansion signals, and churn risk indicators.</li>\n<li>Inform the product roadmap using data. Identify friction in user flows, surface feature adoption patterns, and highlight opportunities for product improvement.</li>\n<li>Partner with product managers and engineers to translate product questions into measurable data and structured experiments.</li>\n<li>Establish data quality standards and documentation so the metrics layer you build is trusted and maintained.</li>\n</ul>\n<p><strong>Who You Are</strong></p>\n<ul>\n<li>3+ years of experience in product analytics, analytics engineering, or data science at a B2B SaaS company or high-growth startup</li>\n<li>Strong in SQL and Python. You can write production-quality transforms, not just ad hoc queries.</li>\n<li>Experience with modern data stack tools: dbt, Airflow, Snowflake, BigQuery, or similar. You understand data modeling and warehouse architecture.</li>\n<li>You have built dashboards and reporting that product teams and leadership actually use to make decisions</li>\n<li>You understand B2B product analytics: account-level metrics, multi-user workflows, enterprise engagement patterns, and why B2B retention analysis is different from consumer</li>\n<li>You translate ambiguous product questions into structured analyses. You do not wait for someone to hand you a spec.</li>\n<li>Strong product intuition. You care about why users behave the way they do, not just what the numbers say.</li>\n<li>Clear communicator. You can present findings to engineers, product managers, and executives with equal effectiveness.</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>The salary range for this position is set between $180,000 to $260,000. 
This range may be inclusive of several career levels at Hebbia and will be narrowed during the interview process based on the candidate’s experience and qualifications.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_8a8c0eb9-6e6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Hebbia","sameAs":"https://hebbia.com/","logo":"https://logos.yubhub.co/hebbia.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/hebbia/jobs/4670090005","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$180,000 - $260,000","x-skills-required":["SQL","Python","dbt","Airflow","Snowflake","BigQuery","data modeling","warehouse architecture","product analytics","analytics engineering","data science"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:37:39.339Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City; San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, Python, dbt, Airflow, Snowflake, BigQuery, data modeling, warehouse architecture, product analytics, analytics engineering, data science","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":260000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_58df2f04-af4"},"title":"Data Engineer","description":"<p>We are looking for a Data Engineer to join our Data Platform team to partner with our product and business stakeholders across risk, operations, and other domains. As a Data Engineer, you will be responsible for building robust data pipelines and engineering foundations by ingesting data from disparate sources, ensuring data quality and consistency, and enabling better business decisions through reliable data infrastructure across core product areas.</p>\n<p>Your primary focus will be on building scalable data pipelines using Airflow to orchestrate data workflows that ingest, transform, and deliver data from various sources into Snowflake and Databricks. You will also design and implement data models in Snowflake that support analytics, reporting, and ML use cases with a focus on performance, reliability, and scalability.</p>\n<p>In addition, you will develop infrastructure as code using Terraform to automate and manage cloud resources in AWS, ensuring consistent and reproducible deployments. You will monitor data pipeline health and implement data quality checks to ensure accuracy, completeness, and timeliness of data as business needs evolve.</p>\n<p>You will also optimize data processing workflows to improve performance, reduce costs, and handle growing data volumes efficiently. Troubleshooting and resolving data pipeline issues, working through ambiguity to get to the root cause and implementing long-term fixes will be a key part of your role.</p>\n<p>As a Data Engineer, you will bridge gaps between data and the business by working with cross-functional teams across the US and India office to understand requirements and translate them into robust technical solutions. 
You will create comprehensive documentation on data pipelines, data models, and infrastructure, keeping documentation up to date and facilitating knowledge transfer across the team.</p>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>2+ years of data engineering experience with strong technical skills and the ability to architect scalable data solutions.</li>\n</ul>\n<ul>\n<li>Hands-on experience with Python for data processing, automation, and building data pipelines.</li>\n</ul>\n<ul>\n<li>Proficiency with workflow orchestration tools, preferably Airflow, including DAG development, task dependencies, and monitoring.</li>\n</ul>\n<ul>\n<li>Strong SQL skills and experience with cloud data warehouses like Snowflake, including performance optimization and data modeling.</li>\n</ul>\n<ul>\n<li>Experience with cloud platforms, preferably AWS (S3, Lambda, EC2, IAM, etc.), and understanding of cloud-based data architectures.</li>\n</ul>\n<ul>\n<li>Experience working cross-functionally with data analysts, analytics engineers, data scientists, and business stakeholders to understand requirements and deliver solutions.</li>\n</ul>\n<ul>\n<li>An ownership mentality – this engineer will be responsible for the reliability and performance of their data pipelines and expected to fully understand data flows, dependencies, and their implications on downstream users.</li>\n</ul>\n<p><strong>Nice to have:</strong></p>\n<ul>\n<li>Experience with dbt for transformation logic and analytics engineering workflows integrated with data pipelines.</li>\n</ul>\n<ul>\n<li>Familiarity with Databricks for large-scale data processing, including Spark optimization and Delta Lake.</li>\n</ul>\n<ul>\n<li>Experience with Infrastructure as Code (IaC) tools like Terraform for managing cloud resources and data infrastructure.</li>\n</ul>\n<ul>\n<li>Knowledge of data modeling concepts (e.g., dimensional modeling, star/snowflake schemas, slowly changing dimensions).</li>\n</ul>\n<ul>\n<li>Experience with CI/CD practices for data pipelines and automated testing frameworks.</li>\n</ul>\n<ul>\n<li>Experience with streaming data and real-time processing frameworks</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_58df2f04-af4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Greenlight","sameAs":"https://www.greenlight.com/","logo":"https://logos.yubhub.co/greenlight.com.png"},"x-apply-url":"https://jobs.lever.co/greenlight/e98d9733-8b8c-4ce4-997d-6cf14e35b2f3","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Airflow","Python","SQL","Snowflake","Databricks","AWS","Terraform","data engineering","data pipelines","data modeling"],"x-skills-preferred":["dbt","Infrastructure as Code","CI/CD","streaming data","real-time processing"],"datePosted":"2026-04-17T12:36:30.660Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Airflow, Python, SQL, Snowflake, Databricks, AWS, Terraform, data engineering, data pipelines, data modeling, dbt, Infrastructure as Code, CI/CD, streaming data, real-time processing"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_faec8dc3-4d3"},"title":"Senior Machine Learning 
Scientist","description":"<p>We are seeking a Senior Machine Learning Scientist to help grow the Machine Learning Science team. The ideal candidate has a strong knowledge of artificial intelligence (AI), including machine learning (ML) fundamentals and extensive experience with deep learning (DL) methods. They will be responsible for the development of algorithms for early, blood-based detection tests for cancer. They will build on a foundation of ML/DL and statistical skills to develop models for identifying molecular signals from blood. They will also work with computational biologists, molecular biologists and ML engineers to design and drive research experiments, and will have a significant impact on the continued growth of an organisation dedicated to changing the entire landscape of cancer.</p>\n<p>The role reports to the Director, Machine Learning Science. This role can be a Hybrid role based in our Brisbane, California headquarters (2-3 days per week in office), or remote.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Independently pursuing cutting-edge research in AI applied to biological problems</li>\n<li>Building new models or fine-tuning existing models to identify biological changes resulting from disease</li>\n<li>Building models that achieve high accuracy and that generalise robustly to new data</li>\n<li>Applying contemporary interpretability techniques to provide a deeper understanding of the underlying signal identified by the model, ideally suggesting potential biological mechanisms</li>\n<li>Working closely with ML Engineering partners to ensure that Freenome&#39;s computational infrastructure supports optimal model training and iteration</li>\n<li>Taking a mindful, transparent, and humane approach to your work</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>PhD or equivalent research experience with an AI emphasis and in a relevant, quantitative field such as Computer Science, Statistics, Mathematics, Engineering, Computational Biology, or Bioinformatics</li>\n<li>3+ years of postdoc or post-PhD industry experience achieving impactful results using relevant modelling techniques</li>\n<li>Expertise, demonstrated by research publications or industry achievements, in applied machine learning, deep learning and complex data modelling</li>\n<li>Practical and theoretical understanding of fundamental ML models like generalised linear models, kernel machines, decision trees and forests, neural networks</li>\n<li>Practical and theoretical understanding of DL models like large language models or other foundation models</li>\n<li>Extensive experience with training paradigms like supervised learning, self-supervised learning, and contrastive learning</li>\n<li>Proficient in current state of the art in ML/DL approaches in different domains, with an ability to envision their applications in biological data</li>\n<li>Proficiency in a general-purpose programming language: Python, R, Java, C, C++, etc.</li>\n<li>Proficiency in one or more ML frameworks such as; Pytorch, Tensorflow and Jax; and ML platforms like Hugging Face</li>\n<li>Experience in ML analysis and developer tools like TensorBoard, MLflow or Weights &amp; Biases</li>\n<li>Excellent ability to communicate across disciplines, work collaboratively, and make progress in smaller steps via experimental iterations</li>\n<li>A passion for innovation and demonstrated initiative in tackling new areas of research</li>\n</ul>\n<p>Nice to have qualifications include:</p>\n<ul>\n<li>Deep domain-specific experience in 
computational biology, genomics, proteomics or a related field</li>\n<li>Experience in building DL models for genomic data, with knowledge of state-of-the-art DNA foundation models</li>\n<li>Experience in NGS data analysis and bioinformatic pipelines</li>\n<li>Experience with containerized cloud computing environments such as Docker in GCP, Azure, or AWS</li>\n<li>Experience in a production software engineering environment, including the use of automated regression testing, version control, and deployment systems</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_faec8dc3-4d3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Freenome","sameAs":"https://freenome.com","logo":"https://logos.yubhub.co/freenome.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/freenome/jobs/7963050002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$173,775 - $246,750","x-skills-required":["PhD or equivalent research experience","Applied machine learning","Deep learning","Complex data modelling","Generalised linear models","Kernel machines","Decision trees and forests","Neural networks","Large language models","Supervised learning","Self-supervised learning","Contrastive learning","Python","R","Java","C","C++","Pytorch","Tensorflow","Jax","Hugging Face","TensorBoard","MLflow","Weights & Biases"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:35:12.037Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Brisbane, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"PhD or equivalent research experience, Applied machine learning, Deep learning, Complex data modelling, Generalised linear models, Kernel machines, Decision trees and forests, Neural networks, Large language models, Supervised learning, Self-supervised learning, Contrastive learning, Python, R, Java, C, C++, Pytorch, Tensorflow, Jax, Hugging Face, TensorBoard, MLflow, Weights & Biases","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":173775,"maxValue":246750,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_369c1543-2c5"},"title":"Senior Full Stack Engineer - Conversation Intelligence","description":"<p>Join us on this thrilling journey to revolutionize the workforce with AI. The future of work is here, and it&#39;s at Cresta.</p>\n<p>The QM &amp; Coaching Team at Cresta is vital for our post-call intelligence products. Our core mission is to utilize Large Language Models (LLMs) and advanced AI techniques to transform conversations into actionable intelligence, streamlining the process of agent performance review and coaching. We build systems that deeply analyze agent behaviors, delivering structured insights into their performance and pinpointing critical areas for improvement. Through AI-powered exploration and interactive workflows, our work empowers organizations to leverage data-driven decisions and enhance the overall customer experience.</p>\n<p>As a Senior Fullstack Engineer, you’ll play a key role in building and scaling the no-code platform that powers Cresta’s processing capabilities. 
This platform empowers non-technical users to configure conversation workflows and apply automation without writing code. You will work across the stack, from building intuitive UIs to robust backend services, enabling customers to unlock value from conversations quickly and flexibly.</p>\n<p>You’ll partner with designers, product managers and AI/ML teams to turn complex requirements into delightful and performant product experiences. You’ll also help shape the future of our no-code architecture, ensuring it scales with our product and customer base.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, develop, and maintain end-to-end features for Cresta’s no-code processing platform.</li>\n<li>Build intuitive UI components and visual editors for configuring conversation logic and workflows.</li>\n<li>Architect and implement backend services and APIs to power a dynamic no-code interface.</li>\n<li>Work closely with ML engineers to expose conversation intelligence in an accessible and configurable way.</li>\n<li>Develop data models and storage layers using Postgres, ClickHouse, and Elasticsearch.</li>\n<li>Identify areas for performance improvements and scalability in both frontend and backend systems.</li>\n<li>Ensure reliability, security, and maintainability across the full technology stack.</li>\n<li>Participate in design discussions, code reviews, and continuous integration processes.</li>\n</ul>\n<p>Qualifications:</p>\n<ul>\n<li>Proven experience as a Senior Fullstack Engineer, with strong frontend and backend contributions to no-code or low-code platforms.</li>\n<li>Deep understanding of modern frontend technologies (React, TypeScript, etc.) and design patterns for building complex UIs.</li>\n<li>Backend expertise in Python, Go, or similar languages; experience with RESTful APIs and microservices architecture.</li>\n<li>Strong foundation in database systems and data modeling; hands-on experience with Postgres, ClickHouse, or Elasticsearch is a plus.</li>\n<li>Experience building platforms or tools for non-technical users, especially in AI/ML, automation, or workflow spaces.</li>\n<li>Familiarity with managing state, execution logic, or visual programming paradigms is a bonus.</li>\n<li>Strong collaboration and communication skills, able to work effectively across product, design, and engineering teams.</li>\n<li>Passion for building systems that simplify complex problems and empower users.</li>\n</ul>\n<p>Perks &amp; Benefits:</p>\n<ul>\n<li>Comprehensive medical, dental, and vision coverage with plans to fit you and your family</li>\n<li>Flexible PTO to take the time you need, when you need it</li>\n<li>Paid parental leave for all new parents welcoming a new child</li>\n<li>Retirement savings plan to help you plan for the future</li>\n<li>Remote work setup budget to help you create a productive home office</li>\n<li>Monthly wellness and communication stipend to keep you connected and balanced</li>\n<li>In-office meal program and commuter benefits provided for onsite employees</li>\n</ul>\n<p>Compensation at Cresta:</p>\n<ul>\n<li>Cresta’s approach to compensation is simple: recognize impact, reward excellence, and invest in our people. We offer competitive, location-based pay that reflects the market and what each individual brings to the table.</li>\n<li>The posted base salary range represents what we expect to pay for this role in a given location. Final offers are shaped by factors like experience, skills, education, and geography. 
In addition to base pay, total compensation includes equity and a comprehensive benefits package for you and your family.</li>\n<li>OTE Range: $205,000–$270,000 + Offers Equity</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_369c1543-2c5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Cresta","sameAs":"https://www.cresta.ai/","logo":"https://logos.yubhub.co/cresta.ai.png"},"x-apply-url":"https://job-boards.greenhouse.io/cresta/jobs/5026013008","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Modern frontend technologies (React, TypeScript, etc.)","Design patterns for building complex UIs","Backend expertise in Python, Go, or similar languages","RESTful APIs and microservices architecture","Database systems and data modeling"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:26:53.398Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States (Remote)"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Modern frontend technologies (React, TypeScript, etc.), Design patterns for building complex UIs, Backend expertise in Python, Go, or similar languages, RESTful APIs and microservices architecture, Database systems and data modeling"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e231d72c-b82"},"title":"Senior Software Engineer, Backend (Berlin)","description":"<p>Join us on this thrilling journey to revolutionize the contact center workforce with AI. As a Senior full-stack engineer, with a backend focus, you will be at the forefront of shaping the future of customer engagement! You&#39;ll be instrumental in delivering timely, actionable insights that drive business growth from day one.</p>\n<p>We&#39;re building a state-of-the-art Customer Data Platform, visualizing relevant insights for businesses post-onboarding and guiding customer engagement across all touch-points. 
Be part of the team that&#39;s redefining the way businesses connect with their customers!</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Design, implement, and maintain backend services and APIs to support applications.</li>\n<li>Build and optimize data storage solutions using Postgres, ClickHouse, and Elasticsearch to ensure high performance and scalability.</li>\n<li>Collaborate with cross-functional teams, including frontend engineers, data scientists, and machine learning engineers, to deliver end-to-end solutions.</li>\n<li>Monitor and troubleshoot performance issues in distributed systems and databases.</li>\n<li>Write clean, maintainable, and efficient code following best practices for backend development.</li>\n<li>Participate in code reviews, testing, and continuous integration efforts.</li>\n<li>Ensure security, scalability, and reliability of backend services.</li>\n<li>Analyze and improve system architecture, focusing on performance bottlenecks, scaling, and security.</li>\n</ul>\n<p><strong>Qualifications We Value:</strong></p>\n<ul>\n<li>Proven experience as a Backend Engineer with a focus on database design and system architecture.</li>\n<li>Strong expertise in ClickHouse or similar columnar databases for managing large-scale, real-time analytical queries.</li>\n<li>Hands-on experience with Elasticsearch for indexing and searching large datasets.</li>\n<li>Proficient in backend programming languages such as Python, Go.</li>\n<li>Experience with RESTful API design and development.</li>\n<li>Solid understanding of distributed systems, microservices architecture, and cloud infrastructure.</li>\n<li>Experience with performance tuning, data modeling, and query optimization.</li>\n<li>Strong problem-solving skills and attention to detail.</li>\n<li>Excellent communication and teamwork abilities.</li>\n</ul>\n<p><strong>Perks &amp; Benefits:</strong></p>\n<ul>\n<li>Paid parental leave to support you and your family</li>\n<li>Monthly Health &amp; Wellness allowance</li>\n<li>Work from home office stipend to help you succeed in a remote environment</li>\n<li>Lunch reimbursement for in-office employees</li>\n<li>PTO: 28 days in Germany</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e231d72c-b82","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Cresta","sameAs":"https://www.cresta.ai/","logo":"https://logos.yubhub.co/cresta.ai.png"},"x-apply-url":"https://job-boards.greenhouse.io/cresta/jobs/4668107008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Postgres","ClickHouse","Elasticsearch","Python","Go","RESTful API design and development","Distributed systems","Microservices architecture","Cloud infrastructure","Performance tuning","Data modeling","Query optimization"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:26:29.315Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Berlin, Germany (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Postgres, ClickHouse, Elasticsearch, Python, Go, RESTful API design and development, Distributed systems, Microservices architecture, Cloud infrastructure, Performance tuning, Data modeling, Query 
optimization"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_408a1b9e-7b9"},"title":"Product Lead, Financial Infrastructure, Custody-as-a-Service","description":"<p>At Anchorage Digital, we are building the world&#39;s most advanced digital asset platform for institutions to participate in crypto. As the Product Lead, you will own the vision, strategy, and execution of this platform from the ground up , defining how the next generation of retail brokerages, remittance platforms, and digital wallets will build their services entirely on Anchorage.</p>\n<p>This is a rare opportunity to shape a high-impact platform product at the intersection of regulated finance, crypto-native infrastructure, and developer experience , backed by the credibility and scale of Anchorage Digital, trusted by the world&#39;s largest financial institutions.</p>\n<p><strong>Technical Skills:</strong></p>\n<ul>\n<li>Drive the detailed prioritization of the CaaS platform roadmap, balancing foundational infrastructure work, partner-facing features, compliance requirements, and developer experience.</li>\n<li>Build comprehensive go-to-market strategy and execution, including defining success metrics, tracking adoption KPIs, and iterating based on partner feedback and data.</li>\n<li>Build deep understanding of system architecture, including multi-tenant platform design, API gateway patterns, ledger systems, and event-driven architectures, and communicate clear requirements to engineering.</li>\n</ul>\n<p><strong>Complexity and Impact of Work:</strong></p>\n<ul>\n<li>Lead and influence cross-functional teams across Product, Engineering, Design, Compliance, Legal, Security, and GTM while maintaining strong stakeholder relationships.</li>\n<li>Manage independent decision-making through ambiguity inherent in scaling a platform and take full ownership of the CaaS product line and its growth.</li>\n<li>Demonstrate product leadership that elevates team performance, attracts top talent, and establishes Anchorage as the definitive infrastructure-as-a-service provider in digital assets.</li>\n</ul>\n<p><strong>Organizational Knowledge:</strong></p>\n<ul>\n<li>Deliver company objectives through strategic product decisions that unlock new partner revenue while strengthening Anchorage&#39;s regulatory moat.</li>\n<li>Actively contribute to organizational strategy by identifying how CaaS capabilities can also benefit direct institutional clients.</li>\n</ul>\n<p><strong>Communication and Influence:</strong></p>\n<ul>\n<li>Effectively influence and motivate across all levels of the organization, this role requires alignment from founders through to individual engineers.</li>\n<li>Serve as the internal and external evangelist for the CaaS platform, presenting to partners, at industry events, and to Anchorage leadership.</li>\n<li>Act as a trusted thought partner to senior leadership on platform strategy, competitive positioning, and market opportunity.</li>\n</ul>\n<p><strong>You may be a fit for this role if you have:</strong></p>\n<ul>\n<li>7+ years of product management experience, with significant experience building B2B2C, payment or platform/infrastructure products.</li>\n<li>Proven track record building 0-to-1 platform products , ideally in fintech, banking-as-a-service, payments, or crypto infrastructure.</li>\n<li>Deep understanding of platform business models , including API-first products, developer ecosystems, multi-tenant architectures, and partner 
enablement.</li>\n<li>Experience with regulated financial services , you understand the interplay between product velocity and compliance requirements (AML/KYC, bank regulations, money transmission).</li>\n<li>Strong technical fluency , you can go deep on API design, system architecture, ledger systems, and data models, and you earn credibility with engineering teams.</li>\n<li>Experience defining and shipping compliance-embedded products , where regulatory requirements are not a bolt-on but a core part of the product value proposition.</li>\n<li>You self-describe as some combination of the following: creative, humble, ambitious, detail oriented, hard working, trustworthy, eager to learn, methodical, action oriented, and tenacious.</li>\n</ul>\n<p><strong>Although not a requirement, bonus points if you have:</strong></p>\n<ul>\n<li>Built or led a banking-as-a-service, custody-as-a-service, or payments-as-a-service platform.</li>\n<li>Direct experience with crypto custody, digital asset wallets, or blockchain infrastructure.</li>\n<li>Worked on multi-jurisdictional product launches involving complex regulatory coordination.</li>\n<li>Background in engineering, equipping you with the acumen to effectively collaborate with technical teams.</li>\n<li>Experience with white-label or embedded finance products.</li>\n<li>Seen and were emotionally moved by the musical Hamilton :)</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_408a1b9e-7b9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anchorage Digital","sameAs":"https://anchorage.com","logo":"https://logos.yubhub.co/anchorage.com.png"},"x-apply-url":"https://jobs.lever.co/anchorage/d921c32e-7ad6-40f9-841a-3e4f11789af0","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["product management","platform business models","API-first products","developer ecosystems","multi-tenant architectures","partner enablement","regulated financial services","compliance requirements","API design","system architecture","ledger systems","data models"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:25:06.996Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"product management, platform business models, API-first products, developer ecosystems, multi-tenant architectures, partner enablement, regulated financial services, compliance requirements, API design, system architecture, ledger systems, data models"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1c431665-20b"},"title":"Data Governance and Management Lead","description":"<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto. We are seeking a Data Governance &amp; Management Lead within the Global Analytics team to help develop and implement data controls, data quality standards, and governance practices across the platform.</p>\n<p>This role supports data integrity, metadata, and access controls to help ensure data is accurate, consistent, and fit for purpose. 
This is a hands-on role that requires strong technical fluency, structured problem-solving, and the ability to translate governance requirements into practical implementations within data systems.</p>\n<p><strong>Technical Skills:</strong></p>\n<ul>\n<li>Working knowledge of data governance, data management, and data quality frameworks</li>\n<li>Experience supporting the implementation of data controls within data pipelines and reporting systems</li>\n<li>Advanced proficiency in SQL, Python, or other data query and analysis tools</li>\n<li>Proficiency with business intelligence and data visualization tools such as Looker, Power BI, or Tableau</li>\n<li>Experience with database design, including understanding complex data schemas and data extraction</li>\n<li>Familiarity with data lineage, metadata management, and data modeling concepts</li>\n<li>Ability to define and implement data quality rules and validation checks</li>\n<li>Understanding of data access principles, including role-based access and data classification</li>\n<li>Ability to document data processes and controls clearly and in a structured way</li>\n</ul>\n<p><strong>Complexity and Impact of Work:</strong></p>\n<ul>\n<li>Oversee the data governance program, identify improvement areas, and implement best practices to enhance data quality, integrity, and security</li>\n<li>Develop and implement data quality standards and monitoring processes, including establishing data quality metrics and thresholds</li>\n<li>Assist in managing the data issue lifecycle, including tracking and supporting remediation efforts</li>\n<li>Manage the data governance platform (Atlan) and serve as the primary subject matter expert</li>\n<li>Assist in data classification efforts, including identifying and categorizing sensitive data and critical data elements</li>\n<li>Manage external data requests, including regulatory inquiries, ensuring compliance with banking regulations</li>\n<li>Monitor and report on key data governance metrics and KPIs, providing insights and recommendations to senior management</li>\n<li>Lead data governance meetings and workshops, facilitating discussions and decision-making to drive the data governance program forward</li>\n</ul>\n<p><strong>Organizational Knowledge:</strong></p>\n<ul>\n<li>Have a deep understanding of Anchorage Digital’s strategy and business lines.</li>\n<li>Understand how data supports decision-making and operational processes across the organization</li>\n<li>Possess strategic thinking and vision, with the ability to develop and implement a comprehensive data governance strategy aligned with organizational goals and objectives</li>\n</ul>\n<p><strong>Communication and Influence:</strong></p>\n<ul>\n<li>Able to communicate complex issues clearly and credibly to a wide range of audiences.</li>\n<li>Document data processes, controls, and findings clearly for internal stakeholders</li>\n<li>Build effective relationships and rapport with stakeholders, including cross-functional and external partners</li>\n<li>Communicate, organize, and execute cross-team goals and projects, leveraging relationships and resources to solve problems</li>\n<li>Collaborate with Data Platform, InfoSec, Product, and Engineering partners</li>\n</ul>\n<p><strong>You may be a fit for this role if you have:</strong></p>\n<ul>\n<li>Bachelor’s degree required. 
Advanced degrees or certifications in data analytics or governance preferred</li>\n<li>4–7 years of experience in data governance, data management, data quality, or data analytics</li>\n<li>Hands-on experience implementing or supporting data quality and governance practices</li>\n<li>Experience managing data classification, access controls, and external data requests</li>\n<li>Experience working with data pipelines, reporting systems, or analytical datasets</li>\n<li>Experience writing, editing, or reviewing technical documentation for regulatory or banking contexts</li>\n<li>Strong attention to detail, with a focus on accuracy, completeness, and consistency in data governance processes and controls</li>\n<li>Ability to work independently on defined tasks and contribute to team objectives</li>\n<li>Strong problem-solving skills and comfort working in structured, detail-oriented environments</li>\n</ul>\n<p><strong>Although not a requirement, bonus points if:</strong></p>\n<ul>\n<li>You&#39;ve kept up to date with the proliferation of blockchain and crypto innovations.</li>\n<li>You were emotionally moved by the soundtrack to Hamilton, which chronicles the founding of a new financial system. :)</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_1c431665-20b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anchorage Digital","sameAs":"https://anchorage.com","logo":"https://logos.yubhub.co/anchorage.com.png"},"x-apply-url":"https://jobs.lever.co/anchorage/5bfbd64c-933e-418c-9c07-5aea50212c0d","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data governance","data management","data quality frameworks","SQL","Python","Looker","Power BI","Tableau","database design","data lineage","metadata management","data modeling","data access principles","role-based access","data classification"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:22:29.501Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"data governance, data management, data quality frameworks, SQL, Python, Looker, Power BI, Tableau, database design, data lineage, metadata management, data modeling, data access principles, role-based access, data classification"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3d849fbc-058"},"title":"Member of Product, Data Platform","description":"<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto.</p>\n<p>The Data Platform team is the backbone of Anchorage Digital&#39;s information infrastructure. 
As data becomes the lifeblood of every product, compliance workflow, and client-facing report we produce, this team is responsible for building and operating a unified, scalable, and reliable data platform that serves the entire organization.</p>\n<p>As a Data Platform Product Manager, you will own the strategy and execution for centralizing and formalizing the company&#39;s data infrastructure , spanning internal operational data, transaction and blockchain data, customer data, and external data sources.</p>\n<p>Your mission is to transform a fragmented data landscape into a single source of truth that powers mission-critical reporting, business insights, and downstream product experiences across every team at Anchorage.</p>\n<p>This is a force-multiplier role. Your work will elevate the quality, speed, and reliability of every product and team at the company.</p>\n<p>You will define the standards, build the platform, and create the foundation that enables Anchorage to scale with confidence.</p>\n<p>If you thrive at the intersection of complex data systems, cross-functional influence, and platform thinking, this is your opportunity to have outsized impact at a category-defining company in digital assets.</p>\n<p>Below, we define our Factors of Growth &amp; Impact to help Anchorage Villagers measure their impact and articulate feedback, coaching, and the rich learning that happens while exploring, developing, and mastering capabilities within and beyond the Member of Product, Data Platform role:</p>\n<p><strong>Technical Skills:</strong></p>\n<ul>\n<li>Own the detailed prioritization of the data platform roadmap, balancing foundational infrastructure work, new capabilities, and technical debt.</li>\n<li>Demonstrate deep strategic thinking in shaping the platform roadmap, considering the unique data challenges of digital assets, blockchain protocols, and regulated financial services.</li>\n<li>Deliver complex, cross-functional projects with multiple dependencies across engineering, analytics, compliance, and operations teams.</li>\n<li>Work closely with engineering and data science counterparts to drive product development processes, sprint planning, and architectural decisions.</li>\n<li>Ability to understand and reason about system architecture , including data warehousing, ETL/ELT pipelines, streaming vs. 
batch processing, and modern data stack components , and communicate clear requirements to engineering.</li>\n<li>Drive comprehensive go-to-market strategy for internal platform adoption, including defining success metrics, tracking KPIs around data quality and platform usage, and iterating based on data-driven insights.</li>\n</ul>\n<p><strong>Complexity and Impact of Work:</strong></p>\n<ul>\n<li>Lead and influence cross-functional teams while maintaining strong stakeholder relationships across the entire organization , from engineering to finance to compliance.</li>\n<li>Exercise independent decision-making and take full ownership of data platform strategy and execution.</li>\n<li>Contribute strategic insights that significantly impact company direction, operational efficiency, and product quality.</li>\n<li>Demonstrate platform leadership that elevates the performance and effectiveness of every team that depends on data.</li>\n</ul>\n<p><strong>Organizational Knowledge:</strong></p>\n<ul>\n<li>Develop deep understanding of Anchorage&#39;s business model, product suite, regulatory environment, and organizational structure.</li>\n<li>Build and maintain strong relationships with stakeholders across all departments to ensure the data platform serves the company&#39;s most critical needs.</li>\n<li>Navigate and improve organizational data practices to enhance efficiency, compliance, and decision-making.</li>\n<li>Drive company objectives through strategic data platform decisions and initiatives.</li>\n</ul>\n<p><strong>Communication and Influence:</strong></p>\n<ul>\n<li>Effectively influence and motivate teams across the organization to adopt platform standards and invest in data quality, even when those teams do not report to you.</li>\n<li>Enable cross-functional collaboration through clear, consistent communication about platform capabilities, timelines, and data governance expectations.</li>\n<li>Act as a thoughtful knowledge partner to senior leadership, translating complex data infrastructure topics into clear business impact.</li>\n<li>Proactively communicate platform goals, status updates, and data health metrics throughout the organization.</li>\n</ul>\n<p><strong>You may be a fit for this role if you:</strong></p>\n<ul>\n<li>5+ years of product management experience, with significant time spent on data platforms, data infrastructure, or data-intensive enterprise products.</li>\n<li>Proven experience building or scaling enterprise data platforms , including data warehousing, data lakes, ETL/ELT pipelines, or modern data stack tooling (e.g., Snowflake, Databricks, dbt, Airflow, Spark).</li>\n<li>Strong understanding of data modeling, data governance, and data quality frameworks.</li>\n<li>Experience working with diverse data types , including transactional data, customer data, financial data, and ideally blockchain or on-chain data.</li>\n<li>Track record of driving cross-functional alignment and adoption for internal platform products where you must influence without direct authority.</li>\n<li>Exceptional written and verbal communication skills, with the ability to convey complex data architecture concepts to both technical and non-technical audiences.</li>\n<li>Your empathy and adaptability not only complement others&#39; working styles but also embody our culture of curiosity, creativity, and shared understanding.</li>\n<li>You self describe as some combination of the following: creative, humble, ambitious, detail oriented, hard working, trustworthy, eager to learn, methodical, 
action oriented, and tenacious.</li>\n</ul>\n<p><strong>Although not a requirement, bonus points if you have:</strong></p>\n<ul>\n<li>You have hands-on experience with blockchain data indexing, onchain analytics, or crypto-native data infrastructure.</li>\n<li>You have built data platforms that serve both internal analytics consumers and external client-facing products (reports, statements, dashboards).</li>\n<li>You have experience supporting clients with data-related issues or concerns.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3d849fbc-058","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anchorage Digital","sameAs":"https://anchorage.com","logo":"https://logos.yubhub.co/anchorage.com.png"},"x-apply-url":"https://jobs.lever.co/anchorage/0e730f61-a2e4-4152-8277-3f6383cc69a6","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data platforms","data infrastructure","data-intensive enterprise products","data warehousing","data lakes","ETL/ELT pipelines","modern data stack tooling","Snowflake","Databricks","dbt","Airflow","Spark","data modeling","data governance","data quality frameworks","blockchain or on-chain data"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:18:21.529Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data platforms, data infrastructure, data-intensive enterprise products, data warehousing, data lakes, ETL/ELT pipelines, modern data stack tooling, Snowflake, Databricks, dbt, Airflow, Spark, data modeling, data governance, data quality frameworks, blockchain or on-chain data"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_79e9796a-15c"},"title":"Tech Strategy Product Manager, Google DeepMind","description":"<p>You will support the analytical and operational frameworks that enable Leadership (SVP+) decision forums to govern compute utilization and planning. This involves translating complex data into strategic insights that align DeepMind and Google&#39;s broader AI strategy.</p>\n<p>As a Tech Strategy Product Manager, you will conduct rigorous, deep-dive investigations into Google&#39;s hardest long-range strategic questions, providing data-driven clarity needed for executive decision-making. 
You will model complex scenarios and trade-offs to surface opinionated strategies for maximizing AI value across Google.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Investigative Modeling: Conducting deep-dive investigations into Google&#39;s hardest long-range strategic questions, providing data-driven clarity needed for executive decision-making.</li>\n<li>Scenario Architecture: Modelling complex scenarios and trade-offs to surface opinionated strategies for maximizing AI value across Google.</li>\n<li>SVP+ Content Support: Contributing to the development of high-stakes presentations, communications, and mandates for SVP-level forums, ensuring technical accuracy and narrative flow.</li>\n<li>Cross-Functional Partnership: Acting as a bridge between technical teams and strategy, partnering with virtual teams across Google to ensure product and policy plans are grounded in reality.</li>\n<li>Execution &amp; Monitoring: Helping maintain per-mandate monitoring views to keep programs accountable; collaborating with MLSA to ensure company-wide mandates are organized and delivering on goals.</li>\n<li>Strategic Cataloging: Maintaining a &#39;living catalog&#39; of ongoing experiments and proposed innovations across Google to identify topics worthy of senior executive discussion.</li>\n</ul>\n<p>This role is for you if you demonstrate the analytical rigor and strategic curiosity required to sustain a high level of performance across complex, pan-Google workstreams, pairing a &#39;big picture&#39; perspective with the ability to remain precise and detail-oriented at the speed of an SVP-led forum.</p>\n<p>You should have a BA/BS degree or equivalent practical experience in a technical and/or business area, with 5+ years of experience in product management, management consulting, or a high-growth corporate strategy role. 
Exceptional organisational talent and experience running high-velocity, high-impact programs are also essential.</p>\n<p>In addition, familiarity with the role of compute and data in building LLMs and the technical capabilities of these models would be an advantage.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_79e9796a-15c","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Google DeepMind","sameAs":"https://deepmind.com/","logo":"https://logos.yubhub.co/deepmind.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/deepmind/jobs/7535915","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["machine learning","compute infrastructure","large-scale data modeling","product management","management consulting","corporate strategy"],"x-skills-preferred":["familiarity with LLMs","technical capabilities of LLMs"],"datePosted":"2026-03-31T18:23:10.390Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Mountain View, California, US"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"machine learning, compute infrastructure, large-scale data modeling, product management, management consulting, corporate strategy, familiarity with LLMs, technical capabilities of LLMs"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0047e1c5-08d"},"title":"Backend Engineer, Forward Deployed Engineering","description":"<p>As a Backend Engineer on the Forward Deployed Engineering team at Stripe, you will work alongside AI agents to serve users at scale. This involves maintaining real-time integration maps, running shadow tests against user setups, and performing automated state reconciliation between Stripe and user systems. Your job is the work that requires an engineer: making judgment calls on ambiguous problems, building relationships with user engineering teams, making product decisions, and designing solutions.</p>\n<p>You will engage directly with users to understand their revenue, billing, and payments requirements. You will translate what you learn into technical solutions and bring that user reality back to product teams. This is a genuinely user-facing role, not user-facing in the &quot;I read a dashboard&quot; sense.</p>\n<p>You will build across product boundaries, designing and deploying products and solutions that address product-market fit gaps, not just in Billing but across multi-product boundaries (Payments + Invoicing + Global LPMs). You will embed within Stripe product engineering teams to co-develop the highest-leverage capabilities.</p>\n<p>You will build reusable solutions, not one-off fixes. You will contribute to a customization framework for RFA and adjacent products: tailored billing logic, financial workflows, integrations (custom metering, product catalog integrations, checkout flows). You will build patterns and blueprints that scale beyond the individual engagement.</p>\n<p>You will provide architectural guidance, reviewing user architectures, advising on best practices, and optimizing integration and performance for complex enterprise environments. 
You will contribute to a growing library of architectural patterns for the field.</p>\n<p>You will resolve critical technical challenges, diagnosing and fixing complex product/engineering problems across the stack. You will identify systemic improvements that prevent recurrence and improve platform stability.</p>\n<p>You will inform the product roadmap: the integration gaps, migration friction, and multi-product failures you surface directly shape Stripe&#39;s product strategy. You will advocate for what users actually need based on what you&#39;ve seen firsthand.</p>\n<p>You will raise the bar on engineering, improving engineering standards, tooling, and processes within the team. You will help build for sustainability as the team grows.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0047e1c5-08d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Stripe","sameAs":"https://stripe.com/","logo":"https://logos.yubhub.co/stripe.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/stripe/jobs/7249744","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["5+ years of experience in software engineering, with a strong focus on backend systems","Proven ability to design, build, and maintain highly available, scalable, and secure systems","Strong command of distributed systems, API design, and data modeling","Excellent problem-solving skills and the ability to quickly grasp complex technical and business domains","Clear communicator, both written and verbal, with technical and non-technical stakeholders including external users"],"x-skills-preferred":["Experience with financial automation or billing products (e.g., Stripe Billing, Tax, Revenue Recognition, or similar)","Experience with multi-product integration: stitching together payments, invoicing, billing, and related systems","Familiarity with extensibility models, custom solution frameworks, or platform development","Experience working with large enterprise users or in a customer-facing engineering role","Prior experience in a fast-paced, ambiguous environment where priorities shift based on user needs"],"datePosted":"2026-03-31T18:02:33.158Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"N/A"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"5+ years of experience in software engineering, with a strong focus on backend systems, Proven ability to design, build, and maintain highly available, scalable, and secure systems, Strong command of distributed systems, API design, and data modeling, Excellent problem-solving skills and the ability to quickly grasp complex technical and business domains, Clear communicator, both written and verbal, with technical and non-technical stakeholders including external users, Experience with financial automation or billing products (e.g., Stripe Billing, Tax, Revenue Recognition, or similar), Experience with multi-product integration: stitching together payments, invoicing, billing, and related systems, Familiarity with extensibility models, custom solution frameworks, or platform development, Experience working with large enterprise users or in a customer-facing engineering role, Prior experience in a fast-paced, ambiguous environment where priorities shift based on user 
needs"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4b700ee3-482"},"title":"Analytics Engineer (Finance)","description":"<p>We are looking for an Analytics Engineer to join our team. As an Analytics Engineer, you will be responsible for translating data requirements from across the organisation into robust and reusable data models, with a particular focus on financial regulatory submissions or financial analytics.</p>\n<p>Maintain consistent and clear documentation and communicate with business stakeholders (both technical and non-technical).</p>\n<p>Collaborate with the wider data team to help meet the business goals, including peer reviews.</p>\n<p>Take ownership of a project end-to-end and manage priorities accordingly.</p>\n<p>Our ideal candidate will have strong experience with SQL, experience working within the credit domain, and be a self-starter with the ability to think outside the box.</p>\n<p>They will also have good attention to detail, strong experience with Looker or a similar visualisation tool, and strong communication and documentation skills for both technical and non-technical audiences.</p>\n<p>As a member of our team, you will have the opportunity to work on a wide range of projects and contribute to the development of our data capabilities.</p>\n<p>We offer a competitive salary and benefits package, including 25 days holiday, an extra day&#39;s holiday for your birthday, and annual leave increased with length of service.</p>\n<p>We are an equal opportunities employer and welcome applications from all qualified candidates.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_4b700ee3-482","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Starling Bank","sameAs":"https://www.starlingbank.com/","logo":"https://logos.yubhub.co/starlingbank.com.png"},"x-apply-url":"https://apply.workable.com/j/D74D88F51C","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Looker","credit domain","data modelling","financial analytics"],"x-skills-preferred":["dbt","data visualisation"],"datePosted":"2026-03-20T16:15:09.160Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Southampton"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, Looker, credit domain, data modelling, financial analytics, dbt, data visualisation"}]}