{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/modern-data-platforms"},"x-facet":{"type":"skill","slug":"modern-data-platforms","display":"Modern Data Platforms","count":4},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6365e7d7-511"},"title":"Senior Forward Deployed Data Scientist/Engineer","description":"<p>We&#39;re hiring a Senior Forward Deployed Data Scientist / Engineer to work directly with customers on ambiguous, high-impact problems at the intersection of data science, product development, and AI deployment.</p>\n<p>This is not a traditional analytics role. On this team, data scientists do the core statistical and modeling work, but they also build real tools and products: evaluation explorers, operator workflows, decision-support systems, experimentation surfaces, and customer-specific AI/data applications that get used in production.</p>\n<p>The right candidate is strong in first-principles problem solving, rigorous measurement, and technical execution. They know how to define metrics, design experiments, diagnose failures, and build systems that people actually use. They are also comfortable using modern AI-assisted development tools to prototype and iterate quickly without sacrificing reliability, observability, or judgment. 
Python and SQL matter in this role, but primarily as execution fluency in service of building better products and making better decisions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Partner directly with enterprise customers to understand workflows, operational pain points, constraints, and success criteria</li>\n<li>Turn ambiguous business and product problems into measurable solutions with clear metrics, technical designs, and deployment plans</li>\n<li>Design and build internal and customer-facing data products, including evaluation tools, workflow applications, decision-support systems, and thin product layers on top of data/ML systems</li>\n<li>Build end-to-end solutions across data ingestion, transformation, experimentation, statistical modeling, deployment, monitoring, and iteration</li>\n<li>Design evaluation frameworks, benchmarks, and feedback loops for ML/LLM systems, human-in-the-loop workflows, and model-assisted operations</li>\n<li>Apply rigorous statistical thinking to experimentation, causal inference, metric design, forecasting, segmentation, diagnostics, and performance measurement</li>\n<li>Use AI-assisted development workflows to accelerate prototyping and product iteration, while maintaining strong engineering discipline</li>\n<li>Diagnose failure modes across data quality, model behavior, retrieval, workflow design, and user experience, and drive fixes into production</li>\n<li>Act as the voice of the customer to Product, Engineering, and Data Science, using field learnings to shape roadmap and platform capabilities</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5+ years of experience in data science, machine learning, quantitative engineering, or another highly analytical technical role</li>\n<li>Proven track record of shipping data, ML, or AI systems that delivered measurable business or product impact</li>\n<li>Exceptional ability to structure ambiguous problems, define the right success metrics, and translate them into executable technical plans</li>\n<li>Strong foundation in statistics, experimentation, causal reasoning, and measurement</li>\n<li>Experience building tools or products, not just analyses; for example, internal workflow tools, evaluation systems, operator-facing products, experimentation platforms, or customer-specific applications</li>\n<li>Hands-on fluency in Python, SQL, and modern data/AI tooling; able to inspect data, prototype quickly, debug deeply, and productionize solutions that work</li>\n<li>Comfort using AI-assisted coding and development workflows to move from idea to usable product quickly</li>\n<li>Strong communication and stakeholder management skills; able to work effectively with customers, engineers, product teams, and executives</li>\n<li>High ownership and bias toward shipping in fast-moving environments with incomplete information</li>\n</ul>\n<p>Preferred qualifications:</p>\n<ul>\n<li>Experience in a forward deployed, solutions, consulting, or other client-facing technical role</li>\n<li>Experience designing evaluation frameworks for LLMs, retrieval systems, agentic workflows, or other AI-enabled products</li>\n<li>Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow</li>\n<li>Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery</li>\n<li>Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems</li>\n<li>Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling</li>\n<li>Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow redesign</li>\n</ul>\n<p>What success looks like: Success in this role means taking a messy, high-stakes customer problem and turning it into a deployed system that is actually used. Sometimes that system is a model. Sometimes it is an evaluation framework. Sometimes it is an operator-facing tool or a lightweight data product that changes how decisions get made. 
In all cases, success is defined by measurable impact, rigorous evaluation, and reliable execution.</p>\n<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training. Scale employees in eligible roles are also granted equity-based compensation, subject to Board of Directors approval. Your recruiter can share more about the specific salary range for your preferred location during the hiring process, and confirm whether the hired role will be eligible for an equity grant. You’ll also receive benefits including, but not limited to: Comprehensive health, dental and vision coverage, retirement benefits, a learning and development stipend, and generous PTO. Additionally, this role may be eligible for additional benefits such as a commuter stipend.</p>\n<p>Salary Range: $167,200-$209,000 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6365e7d7-511","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale AI","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4636227005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$167,200-$209,000 USD","x-skills-required":["Python","SQL","Modern data/AI tooling","Statistics","Experimentation","Causal reasoning","Measurement","Data science","Machine learning","Quantitative engineering"],"x-skills-preferred":["Experience in a forward deployed, solutions, consulting, or other client-facing technical role","Experience designing evaluation frameworks for 
LLMs, retrieval systems, agentic workflows, or other AI-enabled products","Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow","Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery","Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems","Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling","Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow redesign"],"datePosted":"2026-04-18T15:59:44.618Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA; New York, NY"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, SQL, Modern data/AI tooling, Statistics, Experimentation, Causal reasoning, Measurement, Data science, Machine learning, Quantitative engineering, Experience in a forward deployed, solutions, consulting, or other client-facing technical role, Experience designing evaluation frameworks for LLMs, retrieval systems, agentic workflows, or other AI-enabled products, Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow, Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery, Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems, Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling, Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow 
redesign","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":167200,"maxValue":209000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_58a44dab-91a"},"title":"Partner Solutions Architect - Japan","description":"<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across Japan. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>You will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud.</p>\n<p>Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing. This is not a purely reactive enablement role. The Partner SA is expected to help shape and execute repeatable partner plays that create revenue.</p>\n<p>That includes enabling partner sellers and architects, supporting account mapping and seller-to-seller engagement, helping define joint value propositions, supporting partner-led pipeline generation, and influencing product and field strategy based on what is learned in-market.</p>\n<p>Internal operating docs show this motion consistently includes enablement sessions, QBR sponsorships, account planning, workshops, field events, and targeted campaigns designed to produce sourced and influenced pipeline.</p>\n<p>You&#39;ll be part of a team helping dbt scale its ecosystem through better partner capability, tighter field alignment, and more repeatable pipeline generation. 
The role is especially important as dbt continues investing in structured partner motions and deeper engagement with major cloud and data platform partners.</p>\n<p>What you&#39;ll do:</p>\n<ul>\n<li>Partner closely with Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>\n<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>\n<li>Travel approximately 30-40% to support partner planning, enablement, executive meetings, and field events across Japan</li>\n</ul>\n<p>This scope reflects how the Partner SA team is already operating: enabling partner field teams, building account-level alignment, supporting QBRs and regional events, and translating those activities into sourced and engaged pipeline.</p>\n<p>What you&#39;ll need:</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>\n<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out:</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>What to expect in the interview process (all video interviews unless accommodations are needed):</p>\n<ul>\n<li>Interview with Talent Acquisition Partner</li>\n<li>Interview with Hiring Manager</li>\n<li>Team Interviews</li>\n<li>Demo Round</li>\n</ul>\n<p>#LI-LA1</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_58a44dab-91a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673657005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data platforms","Snowflake","Databricks","Google Cloud","partner engineering","customer-facing technical 
role"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:53:29.744Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Japan - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner engineering, customer-facing technical role, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3168d7d3-70b"},"title":"Partner Solutions Architect - North America","description":"<p>About Us</p>\n<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across North America. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>As a Partner Solutions Architect, you will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud. 
Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Partner closely with North America Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>\n<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or 
another customer-facing technical role in data and analytics</li>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>\n<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n<li>Ability to influence both strategy and execution, from partner messaging and field 
enablement to product feedback and GTM refinement</li>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>Benefits</p>\n<ul>\n<li>Unlimited vacation (and yes we use it!)</li>\n<li>Pension coverage</li>\n<li>Excellent healthcare</li>\n<li>Paid Parental Leave</li>\n<li>Wellness stipend</li>\n<li>Home office stipend, and more!</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3168d7d3-70b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673630005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data platforms","Snowflake","Databricks","Google Cloud","partner development","field engineering","sales engineering","consulting","partner engineering"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:48:30.813Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Canada - Remote; US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner development, field engineering, sales engineering, consulting, partner engineering, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, 
transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7ffabac7-275"},"title":"Director, Solutions & Forward Deployed Engineering","description":"<p>We are seeking a Director, Solutions &amp; Forward Deployed Engineering to lead the technical delivery of the Zus platform and help customers successfully connect their systems, data, and applications to Zus.</p>\n<p>Reporting to the Head of Customer Success &amp; Delivery, this role will own how customers integrate with the Zus platform. They’ll be responsible for ensuring healthcare organisations and digital health builders can reliably ingest data, connect EHR systems, and deploy applications powered by Zus APIs.</p>\n<p>This leader will oversee teams responsible for forward deployed engineering and technical enablement, working closely with customer engineering teams to integrate Zus into production environments and connect to data networks.</p>\n<p>You will guide customers through the complexity of healthcare interoperability, helping them translate real-world workflows into scalable integrations built on Zus.</p>\n<p>This is a hands-on player-coach role. 
You will lead the team while also personally engaging in complex implementations, architecture discussions, and customer deployments.</p>\n<p>You will champion the use of AI tools, automation frameworks, and reusable integration patterns to dramatically improve how quickly and reliably customers connect to the Zus platform.</p>\n<p>The ideal candidate combines deep experience in healthcare interoperability, enterprise software implementations, API platforms, and AI-enabled engineering workflows with the leadership skills required to scale a delivery organisation.</p>\n<p>Key responsibilities:</p>\n<ul>\n<li>Lead implementation and technical delivery - Own the technical delivery lifecycle from contract signature through production deployment and early adoption</li>\n<li>Lead and grow a team of Solutions Engineers and Forward Deployed Engineers - Develop career paths, performance expectations, and development plans for the team to ensure excellent execution of goals</li>\n<li>Ensure consistent, high-quality execution across multiple concurrent enterprise implementations</li>\n<li>Establish best practices for onboarding, implementation, integration, and go-live readiness</li>\n<li>Set customers up for success across multiple high-priority use cases</li>\n<li>Ensure customers achieve rapid time-to-value from the Zus platform</li>\n<li>Act as player-coach for complex implementations - Personally engage on strategic or technically complex customer deployments</li>\n<li>Guide integrations involving FHIR, HL7, CCD, APIs, SFTP pipelines, and EHR platforms</li>\n<li>Troubleshoot complex interoperability and data pipeline issues</li>\n<li>Work directly with engineering teams to deploy and operationalize Zus products</li>\n<li>Serve as a trusted technical advisor to customer technical and operational stakeholders</li>\n<li>Drive forward deployed engineering - Support customers in building production-grade applications and workflows on top of Zus APIs</li>\n<li>Help customers operationalize clinical and operational data across care delivery workflows</li>\n<li>Lead the development of reference architectures and deployment patterns</li>\n<li>Identify integration opportunities that accelerate product adoption and expansion</li>\n<li>Deliver training and technical enablement - Oversee technical onboarding and training programs for new customers</li>\n<li>Enable customer engineering and product teams to effectively build on the Zus platform</li>\n<li>Develop documentation, workshops, and enablement resources for technical users</li>\n<li>Drive AI-enabled implementation and automation - Lead the adoption of AI tools and automation frameworks across the delivery organisation</li>\n<li>Identify opportunities to automate manual implementation work using LLMs, scripting, and developer tooling</li>\n<li>Develop reusable automation patterns for all parts of the Zus ecosystem</li>\n<li>Help customers leverage Zus data to power AI-enabled workflows and analytics applications</li>\n<li>Partner with Product and Engineering - Translate customer implementation patterns into platform improvements</li>\n<li>Participate in technical discussions to find reusable integration patterns that can be embedded directly into the Zus platform</li>\n<li>Communicate customer needs to the Product &amp; Engineering teams</li>\n</ul>\n<p>You&#39;re a good fit because you have:</p>\n<ul>\n<li>10+ years of experience in technical implementation, solutions engineering, systems integration, or professional services leadership, preferably in healthtech, SaaS, or enterprise software</li>\n<li>Proven experience leading customer-facing teams and scaling implementation or professional services functions</li>\n<li>Deep expertise in healthcare data interoperability, including FHIR, HL7, CCD, and EHR integrations</li>\n<li>Strong understanding of APIs, data ingestion pipelines (ETL, JSON, CSV), and modern data platforms (e.g., Snowflake)</li>\n<li>Experience designing scalable implementation frameworks 
and reusable integration patterns</li>\n<li>Familiarity with secure environments and compliance frameworks (HIPAA, SOC 2)</li>\n<li>Executive presence and the ability to build trust with both technical and non-technical stakeholders</li>\n<li>Strong strategic thinking paired with a willingness to dive into complex technical or delivery challenges when needed</li>\n<li>A self-starter mindset and comfort operating in a fast-paced, evolving startup environment</li>\n<li>Passion for improving healthcare through better access to and use of data</li>\n<li>Willingness to travel up to ~25% for customer engagements, industry events, and company meetings</li>\n<li>Bachelor’s degree in Business, Engineering, or a related field (advanced degree a plus)</li>\n</ul>\n<p>Additional Information:</p>\n<p>We will offer you...</p>\n<ul>\n<li>Competitive compensation that reflects the value you bring to the team: a combination of cash and equity</li>\n<li>Robust benefits that include health insurance, wellness benefits, 401k with a match, unlimited PTO</li>\n<li>Opportunity to work alongside a passionate team that is determined to help change the world (and have fun doing it)</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7ffabac7-275","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Zus","sameAs":"https://zus.com/","logo":"https://logos.yubhub.co/zus.com.png"},"x-apply-url":"https://jobs.lever.co/zushealth/de7b4911-901f-4548-9d68-9b77c0ccf6b6","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$150,000-200,000 per year","x-skills-required":["Healthcare data interoperability","Enterprise software implementations","API platforms","AI-enabled engineering workflows","Leadership skills","FHIR","HL7","CCD","EHR integrations","APIs","Data ingestion pipelines","Modern data 
platforms","Scalable implementation frameworks","Reusable integration patterns","Secure environments","Compliance frameworks","Executive presence","Strategic thinking","Self-starter mindset","Passion for improving healthcare"],"x-skills-preferred":[],"datePosted":"2026-04-17T13:13:25.945Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"Healthcare data interoperability, Enterprise software implementations, API platforms, AI-enabled engineering workflows, Leadership skills, FHIR, HL7, CCD, EHR integrations, APIs, Data ingestion pipelines, Modern data platforms, Scalable implementation frameworks, Reusable integration patterns, Secure environments, Compliance frameworks, Executive presence, Strategic thinking, Self-starter mindset, Passion for improving healthcare","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":150000,"maxValue":200000,"unitText":"YEAR"}}}]}