{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/data-platform"},"x-facet":{"type":"skill","slug":"data-platform","display":"Data Platform","count":100},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_52a09fd7-0a6"},"title":"Trade Floor Support Engineer","description":"<p>We are seeking a highly skilled Trade Floor Support Engineer to join our team. The successful candidate will be responsible for delivering high-quality technical support to end-users in a fast-paced trading environment.</p>\n<p>Key responsibilities include serving as the primary interface with trading business units, ensuring seamless technology operations, and proactively addressing issues. The ideal candidate will thrive in a collaborative environment, demonstrate a strong sense of ownership, and be passionate about providing world-class support to drive business success.</p>\n<p>The role requires a minimum of 5+ years of progressive technical support experience in an enterprise-level environment, preferably within the financial industry. 
The ideal candidate will have strong knowledge of Active Directory and Exchange, familiarity with ITIL frameworks and tools such as ServiceNow, and proven experience with market data platforms and vendor integrations.</p>\n<p>In addition to the above requirements, the successful candidate will be able to prioritize and perform under pressure in a fast-moving, constantly changing environment. They will also be able to maintain and expand technical knowledge through continuous learning, with a focus on delivering exceptional customer support.</p>\n<p>This is an exciting opportunity to work at the heart of a fast-paced trading environment, where technology and business intersect. You will have the chance to collaborate with top-tier professionals, tackle complex challenges, and make a direct impact on the success of our trading operations.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_52a09fd7-0a6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"IT Infrastructure","sameAs":"https://mlp.eightfold.ai","logo":"https://logos.yubhub.co/mlp.eightfold.ai.png"},"x-apply-url":"https://mlp.eightfold.ai/careers/job/755955158840","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$175,000 to $250,000","x-skills-required":["Active Directory","Exchange","ServiceNow","Market data platforms","Vendor integrations","Windows environments","Microsoft Windows OS","Microsoft Office Suite","Mobile devices","VDI and Citrix environments","Basic network and telecommunications connectivity"],"x-skills-preferred":[],"datePosted":"2026-04-18T22:12:39.378Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New York, United States of America"}},"employmentType":"FULL_TIME","occupationalCategory":"IT","industry":"Finance","skills":"Active Directory, 
Exchange, ServiceNow, Market data platforms, Vendor integrations, Windows environments, Microsoft Windows OS, Microsoft Office Suite, Mobile devices, VDI and Citrix environments, Basic network and telecommunications connectivity","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":175000,"maxValue":250000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1d84bf66-726"},"title":"Senior Global Medical Affairs Leader","description":"<p>Location: Gaithersburg. Hybrid: 3 days a week on site. Reporting to the Medical Head of Emerging Medicine, the Senior Global Medical Affairs Leader is an experienced medical affairs leader with a strong record of vision and impact. This role designs and drives bold global medical strategies within the CVRM Emerging Medicine therapy area, steering cross-functional and cross-regional teams to deliver ambitious outcomes. Acting as a key enterprise leader, the Senior GMAL shapes global medical agendas through leadership of Global Medical Teams, Global Product Teams and Disease Area Teams, while partnering closely with Medical and Commercial colleagues across markets and regions. This position champions innovative, disruptive thinking, identifies breakthrough opportunities, reimagines market dynamics, and accelerates high-impact evidence generation and medical change initiatives. It calls for someone ready to navigate and transform health systems, harness digital and advanced analytics, drive stakeholder engagement, and ensure AstraZeneca’s science leads both externally and internally through world-class evidence and strategic partnerships. 
Are you ready to set a new standard for what medical affairs can achieve?</p>\n<p>Accountabilities</p>\n<p>The Senior GMAL leads the design and execution of global product and disease area strategies, translating scientific insight and healthcare system understanding into clear, practice-changing medical plans. This includes shaping ambitious practice change initiatives that influence care pathways, market access, and treatment adoption across diverse geographies. The role is responsible for the global evidence generation strategy, from interventional trials (Phases 2–4) to real-world evidence programs, ensuring that clinical research, registries and RWE studies are aligned with brand, access and healthcare transformation objectives. It drives optimisation and communication of data across scientific platforms, ensuring differentiated evidence reaches regulators, payers and healthcare professionals in a timely and impactful way. A core responsibility is to blend advanced scientific expertise with digital skill, using digital health tools, analytics and data platforms to transform how evidence is communicated and how stakeholders are engaged. The Senior GMAL crafts compelling scientific narratives, leads digital dissemination of evidence, and safeguards accuracy, impact and compliance in all scientific communications. The role builds and sustains influential relationships across the healthcare ecosystem, including global and regional KOLs, professional societies, regulators, payers, patient organisations, digital innovators and health systems. It mobilises support for medical strategy and healthcare change initiatives, while mentoring cross-functional teams and encouraging scientific leadership and collaboration. The Senior GMAL also brings deep regulatory insight to the design of clinical trials and evidence programs, anticipating evolving regulatory and payer requirements to ensure robust, compliant data generation and communication. 
This includes guiding teams on regulatory expectations, shaping evidence packages for submissions and HTA processes, and ensuring that medical plans are fully integrated with commercial, access and lifecycle strategies. Throughout, the role acts with urgency, challenges the status quo, spots new opportunities in data and science, and leads teams through complex decision-making to deliver significant impact for patients worldwide.</p>\n<p>Essential Skills/Experience</p>\n<ul>\n<li>Master’s degree in any field with at least 5 years of experience in the pharmaceutical industry (Medical, Marketing, R&amp;D, Market Access), or extensive experience within a health system (clinical, pharmacy, pathway design, etc.)</li>\n<li>Demonstrated leadership in complex, global healthcare environments, including leading franchise/TA business in key markets, driving brand performance, and leading large, diverse teams.</li>\n<li>Significant expertise and strategic insight into healthcare system pathways, regulatory landscapes, market-shaping, and external stakeholder engagement.</li>\n<li>Consistent track record of internal and external network building, collaboration across geographies and matrix teams, and effective mentoring of junior colleagues or peers.</li>\n<li>Digital proficiency, maximising digital health tools and data platforms to advance evidence generation and communication.</li>\n<li>Deep understanding of patient care pathways, evolving treatment protocols, scientific literature, and contemporary approaches to healthcare transformation.</li>\n<li>Demonstrated ability to guide, influence, and align diverse stakeholders, including thought leaders, regulatory bodies, patient groups, and payer organisations.</li>\n</ul>\n<p>Desirable Skills/Experience</p>\n<ul>\n<li>Doctoral-level degree (MD, PhD, PharmD) in a relevant field.</li>\n<li>Recognised expertise in the relevant or related therapeutic area, often evidenced by a history of clinical practice, research leadership, or 
significant external engagement.</li>\n<li>Advanced experience designing, leading, and interpreting clinical research programs, including pivotal clinical trials (Ph2–Ph4), practice-changing RWE studies, and implementation of innovative evidence generation approaches.</li>\n<li>Broad experience beyond Medical Affairs; for example, prior roles in Marketing, R&amp;D, or Market Access.</li>\n<li>Long-standing relationships with global KOLs, societies, regulators, digital health partners, and innovative healthcare organisations.</li>\n<li>Track record of launching new products or indications in multiple, diverse markets (e.g., US, China, EU), and successfully developing and delivering global medical strategies.</li>\n</ul>\n<p>AstraZeneca offers an environment built on innovation and curiosity where difference is valued and questioning minds are encouraged to challenge convention; teams reflect the diversity of the communities they serve, work closely together across disciplines and geographies, embrace digital and data to solve complex challenges at pace, learn continuously through shared insight and feedback, and channel their entrepreneurial spirit into developing the next generation of therapeutics that improve outcomes for patients and society. If this role matches your experience and ambitions, apply now to join us on this journey! Are you ready to bring insights and fresh thinking to the table? Brilliant! We have one seat available, and we hope it’s yours. Apply today.</p>\n<p>AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. 
We follow all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.</p>\n<p>The annual base pay for this position ranges from $243,586.40 to $336,379.60 USD. Hourly and salaried non-exempt employees will also be paid overtime pay when working qualifying overtime hours. Base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. In addition, our positions offer a short-term incentive bonus opportunity; eligibility to participate in our equity-based long-term incentive program (salaried roles), to receive a retirement contribution (hourly roles), and commission payment eligibility (sales roles). Benefits offered include a qualified retirement program [401(k) plan]; paid vacation and holidays; paid leaves; and health benefits.</p>","url":"https://yubhub.co/jobs/job_1d84bf66-726","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Global BPM CVRM Emerging Medicines","sameAs":"https://astrazeneca.eightfold.ai","logo":"https://logos.yubhub.co/astrazeneca.eightfold.ai.png"},"x-apply-url":"https://astrazeneca.eightfold.ai/careers/job/563877689800017","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$243,586.40 - $336,379.60 USD","x-skills-required":["Master’s degree in any field","5 years of experience in the pharmaceutical industry","Leadership in complex, global healthcare environments","Significant expertise and strategic insight into healthcare system pathways","Digital proficiency, maximising digital health tools and data platforms"],"x-skills-preferred":["Doctoral-level degree (MD, PhD, PharmD)","Recognised expertise in the relevant or related therapeutic 
area","Advanced experience designing, leading, and interpreting clinical research programs","Broad experience beyond Medical Affairs","Long-standing relationships with global KOLs, societies, regulators, digital health partners, and innovative healthcare organisations"],"datePosted":"2026-04-18T22:12:21.577Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Gaithersburg, Maryland, United States of America"}},"employmentType":"FULL_TIME","occupationalCategory":"Medical","industry":"Healthcare","skills":"Master’s degree in any field, 5 years of experience in the pharmaceutical industry, Leadership in complex, global healthcare environments, Significant expertise and strategic insight into healthcare system pathways, Digital proficiency, maximising digital health tools and data platforms, Doctoral-level degree (MD, PhD, PharmD), Recognised expertise in the relevant or related therapeutic area, Advanced experience designing, leading, and interpreting clinical research programs, Broad experience beyond Medical Affairs, Long-standing relationships with global KOLs, societies, regulators, digital health partners, and innovative healthcare organisations","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":243586.4,"maxValue":336379.6,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a3618001-1b4"},"title":"GTM Architect","description":"<p>The GTM Architect, Enterprise will report to the Head of Enterprise GTM and will own the operating systems, measurement, and engagement model that power Scale AI&#39;s enterprise revenue motion.</p>\n<p>This role is responsible for designing and running the enterprise GTM architecture across RevOps, sales performance, and cross-functional execution, ensuring Scale&#39;s largest and most strategic accounts are engaged with rigor, consistency, and 
impact.</p>\n<p>The GTM Architect will bring a strong point of view on how Scale engages enterprise accounts, how performance is measured, and how teams operate day-to-day to drive predictable growth. Over time, this role will have the opportunity to build and lead a GTM / RevOps team.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Own and evolve Scale&#39;s enterprise GTM operating model, including account engagement strategy, funnel design, and sales execution standards</li>\n<li>Design and manage the enterprise RevOps systems stack (e.g., CRM, forecasting, reporting, planning tools) to support scalable growth</li>\n<li>Define, track, and operationalize core enterprise GTM metrics across pipeline health, deal velocity, forecast accuracy, and rep productivity</li>\n<li>Establish and run sales operating cadences including pipeline reviews, forecast calls, QBRs, and performance reviews</li>\n<li>Partner with Sales Leadership and Finance to design, implement, and maintain enterprise sales compensation plans, including quota governance and attainment reporting</li>\n<li>Build executive-level dashboards and reporting to support GTM decision-making and leadership visibility</li>\n<li>Serve as a strategic thought partner to Enterprise Sales leaders, bringing a strong opinion on how accounts should be covered, prioritized, and engaged</li>\n<li>Act as the connective tissue across Sales, Marketing, Finance, Product, and Solutions Engineering for enterprise GTM planning and execution</li>\n<li>Support annual and quarterly planning efforts including territory design, capacity modeling, headcount planning, and quota setting</li>\n<li>Ensure data integrity, process clarity, and operational discipline across all enterprise GTM motions</li>\n</ul>\n<p>Ideally, You Will Have:</p>\n<ul>\n<li>8–12+ years of experience in RevOps, Sales Ops, or GTM Strategy roles, with deep exposure to enterprise sales environments</li>\n<li>Experience supporting complex, multi-stakeholder enterprise sales motions in high-growth B2B SaaS or platform companies</li>\n<li>Proven ownership of sales systems, forecasting, and performance measurement at scale</li>\n<li>Hands-on experience designing and managing enterprise sales compensation plans and reporting</li>\n<li>Strong understanding of enterprise GTM metrics, planning cycles, and operating cadences</li>\n<li>A clear point of view on how enterprise accounts should be engaged and how to operationalize that engagement at scale</li>\n<li>Comfort influencing senior sales leaders and executives without direct authority</li>\n<li>Excellent written and verbal communication skills, including experience building executive-level materials and dashboards</li>\n<li>Strong command of GTM systems and tools (e.g., Salesforce, Clari, planning and reporting tools)</li>\n<li>High attention to detail paired with the ability to operate at a strategic altitude</li>\n<li>Demonstrated ability to operate as a senior IC with the ambition and capability to build and lead a team over time</li>\n<li>Technical curiosity or experience working alongside technical products and teams; familiarity with AI, ML, or data platforms is a plus</li>\n</ul>\n<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training. Scale employees in eligible roles are also granted equity based compensation, subject to Board of Director approval. 
Your recruiter can share more about the specific salary range for your preferred location during the hiring process, and confirm whether the hired role will be eligible for equity grant. You’ll also receive benefits including, but not limited to: Comprehensive health, dental and vision coverage, retirement benefits, a learning and development stipend, and generous PTO. Additionally, this role may be eligible for additional benefits such as a commuter stipend.</p>\n<p>Please reference the job posting&#39;s subtitle for where this position will be located. For pay transparency purposes, the base salary range for this full-time position in the locations of San Francisco, New York, Seattle is: $176,000-$220,000 USD</p>","url":"https://yubhub.co/jobs/job_a3618001-1b4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale AI","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4662232005","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$176,000-$220,000 USD","x-skills-required":["RevOps","Sales Ops","GTM Strategy","CRM","Forecasting","Reporting","Planning Tools","Sales Compensation Plans","Quota Governance","Attainment Reporting","Executive-Level Dashboards","Leadership Visibility","Strategic Thought Partner","Enterprise Sales Leaders","Account Engagement Strategy","Funnel Design","Sales Execution Standards","Data Integrity","Process Clarity","Operational Discipline"],"x-skills-preferred":["AI","ML","Data Platforms","Technical Products","GTM Systems","Tools","Salesforce","Clari","Planning and Reporting Tools"],"datePosted":"2026-04-18T16:02:00.236Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, 
CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"RevOps, Sales Ops, GTM Strategy, CRM, Forecasting, Reporting, Planning Tools, Sales Compensation Plans, Quota Governance, Attainment Reporting, Executive-Level Dashboards, Leadership Visibility, Strategic Thought Partner, Enterprise Sales Leaders, Account Engagement Strategy, Funnel Design, Sales Execution Standards, Data Integrity, Process Clarity, Operational Discipline, AI, ML, Data Platforms, Technical Products, GTM Systems, Tools, Salesforce, Clari, Planning and Reporting Tools","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":176000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6365e7d7-511"},"title":"Senior Forward Deployed Data Scientist/Engineer","description":"<p>We&#39;re hiring a Senior Forward Deployed Data Scientist / Engineer to work directly with customers on ambiguous, high-impact problems at the intersection of data science, product development, and AI deployment.</p>\n<p>This is not a traditional analytics role. On this team, data scientists do the core statistical and modeling work, but they also build real tools and products: evaluation explorers, operator workflows, decision-support systems, experimentation surfaces, and customer-specific AI/data applications that get used in production.</p>\n<p>The right candidate is strong in first-principles problem solving, rigorous measurement, and technical execution. They know how to define metrics, design experiments, diagnose failures, and build systems that people actually use. They are also comfortable using modern AI-assisted development tools to prototype and iterate quickly without sacrificing reliability, observability, or judgment. 
Python and SQL matter in this role, but as execution fluency in service of building better products and making better decisions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Partner directly with enterprise customers to understand workflows, operational pain points, constraints, and success criteria</li>\n<li>Turn ambiguous business and product problems into measurable solutions with clear metrics, technical designs, and deployment plans</li>\n<li>Design and build internal and customer-facing data products, including evaluation tools, workflow applications, decision-support systems, and thin product layers on top of data/ML systems</li>\n<li>Build end-to-end solutions across data ingestion, transformation, experimentation, statistical modeling, deployment, monitoring, and iteration</li>\n<li>Design evaluation frameworks, benchmarks, and feedback loops for ML/LLM systems, human-in-the-loop workflows, and model-assisted operations</li>\n<li>Apply rigorous statistical thinking to experimentation, causal inference, metric design, forecasting, segmentation, diagnostics, and performance measurement</li>\n<li>Use AI-assisted development workflows to accelerate prototyping and product iteration, while maintaining strong engineering discipline</li>\n<li>Diagnose failure modes across data quality, model behavior, retrieval, workflow design, and user experience, and drive fixes into production</li>\n<li>Act as the voice of the customer to Product, Engineering, and Data Science, using field learnings to shape roadmap and platform capabilities</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5+ years of experience in data science, machine learning, quantitative engineering, or another highly analytical technical role</li>\n<li>Proven track record of shipping data, ML, or AI systems that delivered measurable business or product impact</li>\n<li>Exceptional ability to structure ambiguous problems, define the right success metrics, and translate them into executable technical plans</li>\n<li>Strong foundation in statistics, experimentation, causal reasoning, and measurement</li>\n<li>Experience building tools or products, not just analyses; for example internal workflow tools, evaluation systems, operator-facing products, experimentation platforms, or customer-specific applications</li>\n<li>Hands-on fluency in Python, SQL, and modern data/AI tooling; able to inspect data, prototype quickly, debug deeply, and productionize solutions that work</li>\n<li>Comfort using AI-assisted coding and development workflows to move from idea to usable product quickly</li>\n<li>Strong communication and stakeholder management skills; able to work effectively with customers, engineers, product teams, and executives</li>\n<li>High ownership and bias toward shipping in fast-moving environments with incomplete information</li>\n</ul>\n<p>Preferred qualifications:</p>\n<ul>\n<li>Experience in a forward deployed, solutions, consulting, or other client-facing technical role</li>\n<li>Experience designing evaluation frameworks for LLMs, retrieval systems, agentic workflows, or other AI-enabled products</li>\n<li>Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow</li>\n<li>Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery</li>\n<li>Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems</li>\n<li>Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling</li>\n<li>Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow redesign</li>\n</ul>\n<p>What success looks like: Success in this role means taking a messy, high-stakes customer problem and turning it into a deployed system that is actually used. Sometimes that system is a model. Sometimes it is an evaluation framework. Sometimes it is an operator-facing tool or a lightweight data product that changes how decisions get made. 
In all cases, success is defined by measurable impact, rigorous evaluation, and reliable execution.</p>\n<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training. Scale employees in eligible roles are also granted equity based compensation, subject to Board of Director approval. Your recruiter can share more about the specific salary range for your preferred location during the hiring process, and confirm whether the hired role will be eligible for equity grant. You’ll also receive benefits including, but not limited to: Comprehensive health, dental and vision coverage, retirement benefits, a learning and development stipend, and generous PTO. Additionally, this role may be eligible for additional benefits such as a commuter stipend.</p>\n<p>Salary Range: $167,200-$209,000 USD</p>","url":"https://yubhub.co/jobs/job_6365e7d7-511","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale AI","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4636227005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$167,200-$209,000 USD","x-skills-required":["Python","SQL","Modern data/AI tooling","Statistics","Experimentation","Causal reasoning","Measurement","Data science","Machine learning","Quantitative engineering"],"x-skills-preferred":["Experience in a forward deployed, solutions, consulting, or other client-facing technical role","Experience designing evaluation frameworks for 
LLMs, retrieval systems, agentic workflows, or other AI-enabled products","Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow","Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery","Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems","Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling","Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow redesign"],"datePosted":"2026-04-18T15:59:44.618Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA; New York, NY"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, SQL, Modern data/AI tooling, Statistics, Experimentation, Causal reasoning, Measurement, Data science, Machine learning, Quantitative engineering, Experience in a forward deployed, solutions, consulting, or other client-facing technical role, Experience designing evaluation frameworks for LLMs, retrieval systems, agentic workflows, or other AI-enabled products, Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow, Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery, Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems, Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling, Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow 
redesign","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":167200,"maxValue":209000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b68ff4cc-e74"},"title":"Data Engineer, Safeguards","description":"<p><strong>About the role</strong></p>\n<p>Anthropic is looking for a Data Engineer to join the Safeguards team and build the data foundations that keep our AI systems safe. The Safeguards team works to monitor models, prevent misuse, and ensure user well-being.</p>\n<p>You&#39;ll design and build the data pipelines, warehousing solutions, and analytical tooling that power our safety and trust efforts at scale. You&#39;ll work closely with engineers, data scientists, and policy teams to ensure the Safeguards organization has the data it needs to detect abuse patterns, measure the effectiveness of safety interventions, and make informed decisions about model behavior and enforcement.</p>\n<p>This is a high-impact role where your work will directly support Anthropic&#39;s mission to develop AI that is safe and beneficial.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design, build, and maintain scalable data pipelines that support safety monitoring, abuse detection, and enforcement workflows</li>\n<li>Develop and optimize data models and warehousing solutions to enable efficient analysis of large-scale usage and safety data</li>\n<li>Build and maintain dashboards and reporting infrastructure that give Safeguards teams visibility into model behavior, misuse patterns, and enforcement outcomes</li>\n<li>Collaborate with engineers to integrate data from multiple sources, including model outputs, user reports, and automated classifiers, into a unified analytical layer</li>\n<li>Implement data quality frameworks, monitoring, and alerting to ensure the reliability of safety-critical data</li>\n<li>Partner with research 
teams to surface data insights that inform model improvements and safety interventions</li>\n<li>Develop self-service data tooling that enables stakeholders to explore safety data and generate reports independently</li>\n<li>Contribute to data governance practices, including access controls, retention policies, and privacy-compliant data handling</li>\n</ul>\n<p><strong>You may be a good fit if you:</strong></p>\n<ul>\n<li>Have 3+ years of experience in data engineering, analytics engineering, or a related role</li>\n<li>Are proficient in SQL and Python, with experience building and maintaining ETL/ELT pipelines</li>\n<li>Have hands-on experience with modern data stack tools such as dbt, Airflow, Spark, or similar orchestration and transformation frameworks</li>\n<li>Have worked with cloud data platforms (BigQuery, Redshift, Snowflake, or similar)</li>\n<li>Are comfortable building dashboards and data visualizations using tools like Looker, Tableau, or Metabase</li>\n<li>Communicate clearly and can translate complex data concepts for both technical and non-technical audiences</li>\n<li>Are results-oriented, flexible, and willing to pick up slack even when it falls outside your job description</li>\n<li>Care about the societal impacts of AI and are motivated by safety work</li>\n</ul>\n<p><strong>Strong candidates may have:</strong></p>\n<ul>\n<li>Experience with trust &amp; safety, integrity, fraud, or abuse detection data systems</li>\n<li>Experience with large-scale event streaming systems (Kafka, Pub/Sub, Kinesis)</li>\n<li>Built data infrastructure that supports ML model monitoring or evaluation</li>\n<li>A background in statistical analysis, or experience collaborating closely with data scientists</li>\n<li>Developed internal tooling or self-service analytics platforms</li>\n</ul>\n<p><strong>Strong candidates need not have:</strong></p>\n<ul>\n<li>A formal degree in Computer Science or a related field; we value practical experience and demonstrated ability over credentials</li>\n<li>Prior experience in AI or machine learning; you&#39;ll learn the domain-specific context on the job</li>\n<li>Previous experience at an AI safety or research organization</li>\n<li>Deep expertise across every tool listed above; familiarity with a subset and a willingness to learn is enough</li>\n</ul>\n<p><strong>Logistics</strong></p>\n<ul>\n<li>Minimum education: Bachelor’s degree or an equivalent combination of education, training, and/or experience</li>\n<li>Required field of study: A field relevant to the role as demonstrated through coursework, training, or professional experience</li>\n<li>Minimum years of experience: Years of experience required will correlate with the internal job level requirements for the position</li>\n<li>Location-based hybrid policy: Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</li>\n<li>Visa sponsorship: We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</li>\n</ul>\n<p><strong>How we&#39;re different</strong></p>\n<p>We believe that the highest-impact AI research will be big science. At Anthropic we work as a single cohesive team on just a few large-scale research efforts. And we value impact, advancing our long-term goals of steerable, trustworthy AI, rather than work on smaller and more specific puzzles. We view AI research as an empirical science, which has as much in common with physics and biology as with traditional efforts in computer science. We&#39;re an extremely collaborative group, and we host frequent research discussions to ensure that we are pursuing the highest-impact work at any given time. As such, we greatly value communication skills. The easiest way to understand our research directions is to read our recent research. 
This research continues many of the directions our team worked on prior to Anthropic, including: GPT-3, Circuit-Based Interpretability, Multimodal Neurons, Scaling Laws, AI &amp; Compute, Concrete Problems in AI Safety, and Learning from Human Preferences.</p>\n<p><strong>Come work with us!</strong></p>\n<p>Anthropic is a public benefit corporation headquartered in San Francisco. We offer competitive compensation and benefits, optional equity donation matching, generous vacation and parental leave, flexible working hours, and a lovely office space in which to collaborate with colleagues.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b68ff4cc-e74","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.com/","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5156057008","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"£170,000-£220,000 GBP","x-skills-required":["SQL","Python","ETL/ELT pipelines","dbt","Airflow","Spark","cloud data platforms","BigQuery","Redshift","Snowflake","Looker","Tableau","Metabase"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:59:33.960Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, UK"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, ETL/ELT pipelines, dbt, Airflow, Spark, cloud data platforms, BigQuery, Redshift, Snowflake, Looker, Tableau, 
Metabase","baseSalary":{"@type":"MonetaryAmount","currency":"GBP","value":{"@type":"QuantitativeValue","minValue":170000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_04c1ff49-2d1"},"title":"Data Platform Solutions Architect (Professional Services)","description":"<p>We&#39;re hiring for multiple roles within our Professional Services team. As a Data Platform Solutions Architect, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, providing training, and performing other technical tasks to help customers get the most value out of their data.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides, and productionizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>\n<li>Provide an escalated level of support for customer operational issues</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>Extensive experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery, managing scope and timelines</li>\n<li>Documentation and white-boarding skills</li>\n<li>Experience working with clients and managing conflicts</li>\n<li>Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>\n<li>Willingness to travel to customers 10% of the time</li>\n</ul>\n<p>Databricks Certification is preferred but not essential.</p>","url":"https://yubhub.co/jobs/job_04c1ff49-2d1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8396801002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","technical project delivery","documentation and white-boarding skills"],"x-skills-preferred":["Databricks Certification"],"datePosted":"2026-04-18T15:58:52.546Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, United Kingdom"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, technical project 
delivery, documentation and white-boarding skills, Databricks Certification"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_273f3f27-7de"},"title":"Staff Product Manager, Content Experience","description":"<p>We&#39;re looking for a Product Manager to lead our Content Experience strategy. In this role, you will own how users discover, learn from, and act on content: across documentation, in-product help, AI-assisted guidance, and beyond.</p>\n<p>You&#39;ll help define the future of content at Databricks by making it a first-class product experience and integrating content development into product development.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Owning the content experience end-to-end, ensuring it&#39;s helpful, intuitive, and actionable</li>\n<li>Driving strategic improvements to content tooling and workflows</li>\n<li>Building an architecture of participation that enables context experts to contribute directly to content</li>\n<li>Integrating AI to transform content experiences</li>\n<li>Defining metrics that matter and tracking content engagement, time-to-task, support deflection, and user satisfaction</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>7+ years of product management experience, with a proven track record of leading cross-functional initiatives and delivering high-impact user experiences</li>\n<li>Deep understanding of developer tools, data platforms, or technical products with large surface areas</li>\n<li>Strong systems mindset, comfortable designing scalable workflows, content architectures, and tooling integrations</li>\n<li>Experience with developer documentation, content platforms, or product onboarding is a plus</li>\n<li>Strong customer empathy and an obsession with helping users succeed</li>\n<li>Familiarity with AI technologies (especially LLMs) and how they can be applied to content workflows and user guidance</li>\n<li>Experience working with 
technical and non-technical contributors in a collaborative content ecosystem</li>\n</ul>\n<p>Pay Range Transparency: Databricks is committed to fair and equitable compensation practices. The pay range for this role is $181,700-$249,800 USD.</p>","url":"https://yubhub.co/jobs/job_273f3f27-7de","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8040989002","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$181,700-$249,800 USD","x-skills-required":["Product Management","Content Strategy","AI Technologies","Developer Tools","Data Platforms","Technical Products"],"x-skills-preferred":["LLMs","Content Platforms","Product Onboarding"],"datePosted":"2026-04-18T15:58:39.949Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Product Management, Content Strategy, AI Technologies, Developer Tools, Data Platforms, Technical Products, LLMs, Content Platforms, Product Onboarding","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":181700,"maxValue":249800,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5b244f27-9fd"},"title":"Resident Solutions Architect - Communications, Media, Entertainment & Games","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks 
platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, providing training, and performing other technical tasks to help customers get the most value out of their data.</p>\n<p>You will work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides, and productionizing customer use cases. You will work with engagement managers to scope a variety of professional services work with input from the customer.</p>\n<p>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications. Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks.</p>\n<p>Provide an escalated level of support for customer operational issues. You will work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</p>\n<p>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</p>\n<p>The ideal candidate will have:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfort writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Experience with design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery, managing scope and timelines</li>\n<li>Documentation and white-boarding skills</li>\n<li>Experience working with clients and managing conflicts</li>\n<li>Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>\n</ul>\n<p>Travel to customers 20% of the time.</p>","url":"https://yubhub.co/jobs/job_5b244f27-9fd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8461258002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production 
deployments","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:58:34.588Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Raleigh, North Carolina"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0fe57d9a-28e"},"title":"Engagement Manager","description":"<p>Job Title: Engagement Manager</p>\n<p>We are seeking an experienced Engagement Manager to join our team in Tokyo. 
As an Engagement Manager, you will be responsible for driving customer success by ensuring that our customers are getting the most value from our products and services.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Collaborate with sales counterparts to understand customer needs and develop valuable solutions</li>\n<li>Identify opportunities for new services and articulate the business value</li>\n<li>Perform as the Engagement Manager in the assigned area with full accountability for meeting/exceeding Professional Services and Training bookings and revenue targets</li>\n<li>Consult with clients to understand and analyze engagement scope, requirements, time, cost, and benefits</li>\n<li>Drive resolution of delivery challenges, address resource contentions, scoping issues, and manage expectations</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Strong fundamental knowledge of Big Data Platforms Implementation from the technology, operations, and security/governance lenses</li>\n<li>Proven experience in selling services offerings in an implementation, advisory, education, or change management capacity</li>\n<li>Senior customer-facing roles that require a mix of influencing, validating, negotiating, understanding, and execution to both business and technology audiences</li>\n<li>Consistent track record of identifying customer needs and successfully implementing solutions</li>\n<li>Experience owning projects/programs in agile scrum/kanban delivery methodology as well as waterfall methodology</li>\n<li>Strong problem-solving skills, addressing customers&#39; pain points using modern technologies</li>\n<li>Excellent presentation skills, providing proposals that enforce good project governance and drive scalable delivery practices to both internal and external executives</li>\n<li>High-level orchestration skills to align both internal and external stakeholders when proposing large initiatives</li>\n<li>Strong service delivery and program management skills with the ability to 
synthesize customer success outcomes into well-structured program plans that deliver against such outcomes</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Prior experience in project/program proposal to customers at Consulting, SI, Software/Cloud Vendor</li>\n<li>Bachelor&#39;s degree in Computer Science or related educational background</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Comprehensive benefits and perks that meet the needs of all employees</li>\n</ul>\n<p>Commitment to Diversity and Inclusion:</p>\n<ul>\n<li>Databricks is committed to fostering a diverse and inclusive culture where everyone can excel</li>\n</ul>\n<p>Compliance:</p>\n<ul>\n<li>Access to export-controlled technology or source code is required for performance of job duties, and it is within Employer&#39;s discretion whether to apply for a U.S. government license for such positions</li>\n</ul>","url":"https://yubhub.co/jobs/job_0fe57d9a-28e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8501186002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Big Data Platforms Implementation","Customer Success","Project Management","Program Management","Agile Scrum/Kanban Delivery Methodology","Waterfall Methodology","Problem-Solving Skill","Presentation Skills","Project Governance","Service Delivery"],"x-skills-preferred":["Prior Experience in Project/Program Proposal to Customers at Consulting, SI, Software/Cloud Vendor","Bachelor's Degree in Computer Science or Related Educational 
Background"],"datePosted":"2026-04-18T15:58:15.078Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Tokyo, Japan"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Big Data Platforms Implementation, Customer Success, Project Management, Program Management, Agile Scrum/Kanban Delivery Methodology, Waterfall Methodology, Problem-Solving Skill, Presentation Skills, Project Governance, Service Delivery, Prior Experience in Project/Program Proposal to Customers at Consulting, SI, Software/Cloud Vendor, Bachelor's Degree in Computer Science or Related Educational Background"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_f1950023-ef7"},"title":"Senior Engineering Manager, Activation","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Brex’s AI-native automation and world-class service eliminate manual expense and accounting tasks for customers so they can focus on what matters most. Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream. 
We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Engineering</p>\n<p>Engineering at Brex is about building systems that scale with speed and intention. Our teams span Software, Data, Security, and IT, and operate with high autonomy and deep collaboration. We tackle hard technical problems, own our outcomes, and push for excellence at every level, from architecture to deployment. It’s an environment where engineering is a craft, and builders become leaders.</p>\n<p>What you’ll do</p>\n<p>You will lead an engineering group focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, integrations, and implementation workflows that help customers realize value quickly. This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced customer journeys.</p>\n<p>The ideal candidate is a seasoned engineering leader with experience scaling user-facing onboarding systems, delivering high-quality product experiences, and partnering deeply across Product, Design, Operations, and GTM teams.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our New York office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in office: Monday, Wednesday, and Thursday. 
As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities</p>\n<ul>\n<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding and implementation experience.</li>\n<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>\n<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>\n<li>Drive execution of the Activation roadmap, ensuring timely, high-quality delivery of systems and features that help customers activate and realize value.</li>\n<li>Lead and manage multiple teams of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>\n<li>Build systems that integrate identity verification, KYC and compliance workflows, customer data ingestion, and implementation tooling in a scalable and reliable manner.</li>\n<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>\n<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>\n<li>Strong technical background and understanding of software development principles.</li>\n<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>\n<li>Demonstrated track record of shipping customer-facing features across multiple release cycles.</li>\n<li>3+ years of experience managing or leading multiple technical teams in a high-growth environment.</li>\n<li>Regularly works with cross-functional partners (e.g. Product, Design, Operations, Sales) and excels in driving alignment across stakeholders.</li>\n<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer facing experiences.</li>\n<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>\n<li>Track record building AI-powered product experiences, including LLM-driven automation and personalization.</li>\n</ul>\n<p>Bonus points</p>\n<ul>\n<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>\n<li>You have started your own technology venture or were an early technical founder/employee. We value entrepreneurial spirit &amp; scrappiness!</li>\n<li>You are a champion for the customer and constantly put yourself in their shoes to create intuitive, frictionless experiences.</li>\n</ul>\n<p>Compensation</p>\n<p>The expected salary range for this role is $300,000 - $375,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_f1950023-ef7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8330492002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$300,000 - $375,000","x-skills-required":["Technical leadership","Software development principles","Full-stack engineering","Customer-facing features","Data-driven mindset","AI-powered product experiences","LLM-driven automation","Personalization"],"x-skills-preferred":["Data platforms","Snowflake","Hex","Entrepreneurial spirit","Scrappiness","Customer obsession"],"datePosted":"2026-04-18T15:57:39.757Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New York, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Technical leadership, Software development principles, Full-stack engineering, Customer-facing features, Data-driven mindset, AI-powered product experiences, LLM-driven automation, Personalization, Data platforms, Snowflake, Hex, Entrepreneurial spirit, Scrappiness, Customer obsession","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":300000,"maxValue":375000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_22bcbb50-ef4"},"title":"Member of Technical Staff - Data Platform","description":"<p><strong>About the Role</strong></p>\n<p>The Data Platform team at xAI builds and 
operates the infrastructure responsible for all large-scale data transport and processing across the company.</p>\n<p>As a software engineer on the Data Platform team, you will design, build, and operate the distributed systems powering X&#39;s data movement and compute.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design and implement high-throughput, low-latency data ingestion and transport systems.</li>\n<li>Scale and optimise multi-tenant Kafka infrastructure supporting real-time workloads.</li>\n<li>Extend and tune Spark, Flink, and Trino for demanding production pipelines.</li>\n<li>Build interfaces, APIs, and pipelines enabling teams to query, process, and move data at petabyte scale.</li>\n<li>Debug and optimise distributed systems, with a focus on reliability and performance under load.</li>\n<li>Collaborate with ML, product, and infrastructure teams to unblock critical data workflows.</li>\n</ul>\n<p><strong>Basic Qualifications</strong></p>\n<ul>\n<li>Proven expertise in distributed systems, stream processing, or large-scale data platforms.</li>\n<li>Proficiency in Rust, Go, Scala or similar systems languages.</li>\n<li>Hands-on experience with Kafka, Flink, Spark, Trino, or Hadoop in production.</li>\n<li>Strong debugging, profiling, and performance optimisation skills.</li>\n<li>Track record of shipping and maintaining critical infrastructure.</li>\n<li>Comfortable working in fast-moving, high-stakes environments with minimal guardrails.</li>\n</ul>\n<p><strong>Compensation and Benefits</strong></p>\n<p>$180,000 - $440,000 USD</p>\n<p>Base salary is just one part of our total rewards package at X, which also includes equity, comprehensive medical, vision, and dental coverage, access to a 401(k) retirement plan, short &amp; long-term disability insurance, life insurance, and various other discounts and perks.</p>","url":"https://yubhub.co/jobs/job_22bcbb50-ef4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"xAI","sameAs":"https://www.x.com/","logo":"https://logos.yubhub.co/x.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/xai/jobs/4803862007","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$180,000 - $440,000 USD","x-skills-required":["Rust","Go","Scala","Kafka","Flink","Spark","Trino","Hadoop","distributed systems","stream processing","large-scale data platforms"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:57:30.705Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Palo Alto, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Rust, Go, Scala, Kafka, Flink, Spark, Trino, Hadoop, distributed systems, stream processing, large-scale data platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":440000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_477d343e-e37"},"title":"Customer Success Architect","description":"<p>About Mixpanel</p>\n<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence. Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees.</p>\n<p>About the Customer Success Team:</p>\n<p>Mixpanel’s Customer Success &amp; Solutions Engineering teams are analytics consultants who embed themselves within our enterprise customer teams to drive our customers’ business outcomes. 
We work with prospects and customers throughout the customer journey to understand what drives value and serve as the technical counterpart to our Sales organization to deliver on that value.</p>\n<p>You will partner closely with Account Executives, Account Managers, Product, Engineering, and Support to successfully roll out self-serve analytics within our customers’ organizations, help the customer manage change, execute on technical projects and services that delight our customers, and ultimately drive ROI on the customer’s Mixpanel investment.</p>\n<p>About the Role:</p>\n<p>As a CSA, you will partner with customers throughout the customer journey to understand what drives value, beginning in pre-sales, where you run proofs of concept to demonstrate quick time to value, and continuing through post-sales onboarding and implementation, where you set customers up for long-term success with scalable implementation and data governance best practices. Throughout the entire customer lifecycle, you will work to understand how analytics can drive business value for your customers and will consult them on how to maximize the value of Mixpanel, including managing change during Mixpanel’s rollout, defining and achieving ROI, and identifying areas of improvement in their current usage of analytics.</p>\n<p>For large enterprise customers, post onboarding, you will also continue alongside the Account Managers to drive data trust and product adoption for 100+ end user teams through a change management rollout approach.</p>\n<p>Responsibilities:</p>\n<p>Serve as a trusted technical advisor for prospects/customers to provide strategic consultation on data architecture, governance, instrumentation, and business outcomes</p>\n<p>Effectively communicate at most levels of the customer’s organization to influence business outcomes via Mixpanel, design and execute a comprehensive analytics strategy, and unblock technical and organizational roadblocks</p>\n<p>Own the customer’s success with Mixpanel, 
documenting and delivering ROI to the customer throughout their journey to transform their business with self-serve analytics</p>\n<p>Own onboarding and data health for your assigned customers/projects, including ongoing enhancements to their data quality and overall tech stack integration</p>\n<p>Engage with customers’ engineering, product management, and marketing teams to handle technical onboarding, optimize Mixpanel deployments, and improve data trust</p>\n<p>Deliver a variety of technical services ranging from data architecture consultations to adoption and change management best practices</p>\n<p>Leverage modern data architecture expertise to create scalable data governance practices and data trust for our customers, including data optimization and re-implementation projects</p>\n<p>Successfully execute on success outcomes whilst balancing project timelines, scope creep, and unanticipated issues</p>\n<p>Bridge the technical-business gap with your customers, working with business stakeholders to define a strategic vision for Mixpanel and then working with the right business and technical contacts to execute that vision</p>\n<p>Collaborate with our technical and solutions partners as needed on data optimization and onboarding projects</p>\n<p>Be a technical sponsor for internal engagements with Mixpanel product and engineering teams to prioritize product and systems tasks from clients</p>\n<p>We&#39;re Looking For Someone Who Has</p>\n<p>3 to 5 years of experience consulting on defining and delivering ROI through new tool implementations</p>\n<p>Experience working with Director-level members of the customer organization to define a strategic vision and successfully leveraging those members to deliver on that vision</p>\n<p>The ability to communicate with stakeholders at most levels of an organization, from talking with developers about the ins and outs of an API to talking to a Director of Data Science/Product Management about organizational 
efficiency</p>\n<p>The ability to manage complex projects with assorted client stakeholders, working across teams and departments to execute real change</p>\n<p>A demonstrated record of success in customer success, client-facing professional services, consulting, or technical project management roles</p>\n<p>Excellent written, analytical, and communication skills</p>\n<p>Strong process and/or project delivery discipline</p>\n<p>Eagerness to learn new technologies and adapt to evolving customer needs</p>\n<p>We&#39;d Be Extra Excited For Someone Who Has</p>\n<p>Experience in data querying, modeling, and transforming in at least one core tool, including SQL / dbt / Python / Business Intelligence tools / Product Analytics tools, etc.</p>\n<p>Familiarity with databases and cloud data warehouses like Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, etc.</p>\n<p>Familiarity with product analytics implementation methods like SDKs, Customer Data Platforms (CDPs), Event Streaming, Reverse ETL, etc.</p>\n<p>Familiarity with analytics best practices across business segments and verticals</p>\n<p>Benefits and Perks</p>\n<p>Comprehensive Medical, Vision, and Dental Care</p>\n<p>Mental Wellness Benefit</p>\n<p>Generous Vacation Policy &amp; Additional Company Holidays</p>\n<p>Enhanced Parental Leave</p>\n<p>Volunteer Time Off</p>\n<p>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</p>\n<p>Culture Values</p>\n<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>\n<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>\n<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>\n<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>\n<p>Champion the Customer: We seek to deeply 
understand our customers’ needs, ensuring their success is our north star.</p>\n<p>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</p>\n<p>Why choose Mixpanel?</p>\n<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors, including Andreessen Horowitz, Sequoia, YC, and, most recently, Bain Capital.</p>\n<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>\n<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>\n<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>\n<p>Mixpanel is an equal opportunity employer supporting workforce diversity.</p>\n<p>At Mixpanel, we are focused on the things that really matter: our people, our customers, and our partners, out of a recognition that those relationships are the most valuable assets we have.</p>\n<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>\n<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, or any other protected characteristic.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_477d343e-e37","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mixpanel","sameAs":"https://mixpanel.com","logo":"https://logos.yubhub.co/mixpanel.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mixpanel/jobs/7506821","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data architecture","governance","instrumentation","business outcomes","data querying","modeling","transforming","SQL","dbt","Python","Business Intelligence tools","Product Analytics tools"],"x-skills-preferred":["databases","cloud data warehouses","Google Cloud","Amazon Redshift","Microsoft Azure","Snowflake","Databricks","SDKs","Customer Data Platforms","Event Streaming","Reverse ETL"],"datePosted":"2026-04-18T15:57:25.195Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru, India (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data architecture, governance, instrumentation, business outcomes, data querying, modeling, transforming, SQL, dbt, Python, Business Intelligence tools, Product Analytics tools, databases, cloud data warehouses, Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, SDKs, Customer Data Platforms, Event Streaming, Reverse ETL"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_17cafbc1-d4e"},"title":"Named Core Account Executive","description":"<p>Want to help solve the world&#39;s toughest problems with data and AI?</p>\n<p>This is what we do every day at Databricks.</p>\n<p>We are looking for a creative, delivery-oriented Enterprise Account Executive to join the team in the Nordics to maximise the phenomenal market opportunity that exists for Databricks.</p>\n<p>As an Account Executive, you know how to 
sell innovation and value to existing customers, identify new use cases, grow consumption, and guide deals forward to compress decision cycles.</p>\n<p>You love understanding a product in depth and are passionate about communicating its value to customers and partners.</p>\n<p>You will be offered huge potential for career progression with the pace of the team&#39;s growth.</p>\n<p>You will report to the Sales Director for Sweden.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Assess your territory and develop a successful execution strategy</li>\n<li>Drive quarter-on-quarter consumption growth in existing accounts</li>\n<li>Exceed activity and quarterly revenue targets</li>\n<li>Track all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>\n<li>Identify new use case opportunities and showcase value to existing customers</li>\n<li>Promote the value of the Databricks Data Intelligence Platform</li>\n<li>Orchestrate and utilise our field engineering teams to ensure valuable outcomes for clients</li>\n<li>Build and demonstrate value with all engagements to guide successful negotiations to close</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>Extensive understanding of the data platform, open source, and cloud ecosystems</li>\n<li>Highly skilled in prospecting research, with the ability to map out key stakeholders</li>\n<li>Demonstrated success in Value Selling and developing a mutual action plan</li>\n<li>Ability to influence decision-making and strategy with customer leadership teams</li>\n<li>Ability to establish credibility with the C-suite</li>\n<li>Adept in selling to technical buyers</li>\n<li>Experience growing consumption and closing commit deals in a direct sales role</li>\n<li>Mastery of MEDDPICC</li>\n<li>Bachelor&#39;s Degree or relevant work experience</li>\n<li>Fluency in English and Swedish is required</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_17cafbc1-d4e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8402154002","x-work-arrangement":"onsite","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data platform","open source","cloud ecosystems","prospecting research","key stakeholders","value selling","mutual action plan","influence decision-making","customer leadership teams","credibility with C-suite","selling to technical buyers","consumption growth","quarterly revenue targets","salesforce","field engineering teams","valuable outcomes"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:57:22.755Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Stockholm, Sweden"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"data platform, open source, cloud ecosystems, prospecting research, key stakeholders, value selling, mutual action plan, influence decision-making, customer leadership teams, credibility with C-suite, selling to technical buyers, consumption growth, quarterly revenue targets, salesforce, field engineering teams, valuable outcomes"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_52e9ea6f-e2a"},"title":"Enterprise Account Executive","description":"<p>The Enterprise Account Executive will report to the Director of Enterprise GTM and own revenue growth across a portfolio of Scale AI&#39;s largest and most strategic enterprise customers. 
This role is focused on selling complex, highly technical AI solutions into F500 organisations, partnering with executive, technical, and operational stakeholders to drive long-term value and expansion.</p>\n<p>You will be responsible for full-cycle enterprise sales - from prospecting and deal strategy through close, renewal, and expansion - while serving as the quarterback across internal teams including Solutions Engineering, Product, Research, and Operations. This role requires strong ownership, executive presence, and the ability to navigate multi-stakeholder enterprise buying processes in a fast-paced environment.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Own and drive relationships with Scale&#39;s largest and most complex Fortune 500 prospects and customers</li>\n<li>Build trusted relationships with executive, technical, and operational stakeholders across multiple business units</li>\n<li>Develop and execute comprehensive account strategies to drive net-new revenue, expansion, and long-term partnerships</li>\n<li>Lead strategic deal planning and mutual close plans across new business, renewals, and expansions</li>\n<li>Partner closely with Solutions Engineering and Product teams to deliver compelling, technically credible value propositions</li>\n<li>Act as the voice of the customer internally, influencing product roadmap, research priorities, and delivery execution</li>\n<li>Maintain deep understanding of customer business goals, AI maturity, and industry trends to proactively identify opportunities</li>\n<li>Consistently communicate account health, pipeline, and forecast accuracy using Salesforce, Clari, and related tools</li>\n</ul>\n<p>Ideally, You Will Have:</p>\n<ul>\n<li>8+ years of enterprise sales or account management experience, including 2+ years selling deeply technical solutions to both business and technical audiences</li>\n<li>A proven track record of closing and expanding large, complex enterprise deals</li>\n<li>Demonstrated success 
consistently achieving or exceeding quota in enterprise sales roles</li>\n<li>Experience building and executing long-term account strategies to drive sustained revenue growth</li>\n<li>Strong ability to lead enterprise renewal processes from strategy through close</li>\n<li>Excellent written and verbal communication skills, with comfort presenting to executive audiences</li>\n<li>Strong command of enterprise sales processes and systems (Salesforce, Clari, Outreach, Slack)</li>\n<li>A consultative, customer-first mindset with the ability to influence cross-functional internal teams</li>\n<li>Experience developing executive-level materials and business cases</li>\n<li>Strong project management, organisational skills, and attention to detail</li>\n<li>Technical background or strong technical curiosity highly valued, especially familiarity with AI, ML, or data platforms</li>\n</ul>\n<p>Sales Commission:</p>\n<p>This role is eligible to earn commissions. Compensation packages at Scale for eligible roles include base salary, equity, and benefits. 
The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training.</p>\n<p>Benefits:</p>\n<p>Comprehensive health, dental and vision coverage, retirement benefits, a learning and development stipend, and generous PTO.</p>\n<p>Additional benefits may include a commuter stipend.</p>\n<p>Salary Range:</p>\n<p>$207,200 - $259,000 USD</p>","url":"https://yubhub.co/jobs/job_52e9ea6f-e2a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale AI","sameAs":"https://www.scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4646946005","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$207,200 - $259,000 USD","x-skills-required":["Enterprise sales","Account management","Technical sales","Executive presence","Communication skills","Project management","Organisational skills","Attention to detail","AI","ML","Data platforms"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:57:15.121Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Enterprise sales, Account management, Technical sales, Executive presence, Communication skills, Project management, Organisational skills, Attention to detail, AI, ML, Data 
platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":207200,"maxValue":259000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b3cf0ff9-4c6"},"title":"Support Engineer II","description":"<p>About Mixpanel</p>\n<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>\n<p>Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees. Mixpanel delivers insights that customers trust.</p>\n<p>Visit mixpanel.com to learn more.</p>\n<p>About The Support Team</p>\n<p>Mixpanel Support is a team of talented problem-solvers from diverse backgrounds. We care deeply about helping our customers be successful and enabling them to get value from their data.</p>\n<p>We are located all over the world in San Francisco, Barcelona, London, and Singapore...</p>\n<p>About The Role</p>\n<p>The right candidate is an avid learner, an advocate for customers, and a collaborative teammate. 
The main responsibility of a Support Engineer is to help users solve technical challenges and use Mixpanel to make impactful product decisions.</p>\n<p>We’ve had team members focus on developing their technical skills to join the product and engineering teams, hone their customer-facing skills to become customer success managers or sales engineers, and take on leadership roles in the Support organization.</p>\n<p>Responsibilities</p>\n<p>The core responsibility of a Support Engineer is to support our customers at every turn in the Mixpanel journey by providing answers to product questions, sharing best practices, and debugging technical issues.</p>\n<p>You&#39;ll also develop your technical skills, collaborate with our Product team to improve our product, learn product analytics, and mentor new team members.</p>\n<p>Become a Mixpanel product expert - you will help users understand our reports and features, help them use our APIs and SDKs, share best practices, and resolve account issues</p>\n<p>Respond to customer inquiries via Zendesk email, chat, Slack, and phone calls</p>\n<p>Investigate and document bugs and feature requests to share with our Product and Engineering teams</p>\n<p>Provide feedback regarding internal support processes, product functionality, and customer education resources to improve the customer experience</p>\n<p>Shape the product by regularly working closely with PMs, engineers, and designers to incorporate customer learnings into change</p>\n<p>We&#39;re Looking For Someone Who Has</p>\n<p>Experience providing customer-facing SaaS support (in customer support, professional services, technical account management or similar)</p>\n<p>Ability to communicate technical concepts effectively in a clear, friendly writing style</p>\n<p>Excellent problem-solving and analytical skills</p>\n<p>Programming experience, understanding of web &amp; mobile technologies, and interacting with APIs</p>\n<p>Experience with debugging and collaborating with 
engineering to resolve complex technical issues, especially with JavaScript, Python, or mobile technologies</p>\n<p>Ability to be resourceful and resilient when faced with ambiguity and new challenges</p>\n<p>Dedication to developing expertise in a complex and constantly evolving product</p>\n<p>Interest and aptitude to develop technical skills and learn new technologies</p>\n<p>Experience providing SLA-based support and/or dedicated support to strategic customers</p>\n<p>Fluency in Hebrew and English</p>\n<p>Bonus Points</p>\n<p>Experience with Mixpanel or other analytics tools</p>\n<p>Familiar with databases and cloud data warehouses like Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, etc.</p>\n<p>Familiar with product analytics implementation methods like SDKs, Customer Data Platforms (CDPs), Event Streaming, Reverse ETL, etc.</p>\n<p>Benefits and Perks</p>\n<p>Comprehensive Medical, Vision, and Dental Care</p>\n<p>Mental Wellness Benefit</p>\n<p>Generous Vacation Policy &amp; Additional Company Holidays</p>\n<p>Enhanced Parental Leave</p>\n<p>Volunteer Time Off</p>\n<p>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</p>\n<p>Culture Values</p>\n<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>\n<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>\n<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>\n<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>\n<p>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north star.</p>\n<p>Why choose Mixpanel?</p>\n<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors, including Andreessen Horowitz, Sequoia, YC, 
and, most recently, Bain Capital.</p>\n<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>\n<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>\n<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>\n<p>Mixpanel is an equal opportunity employer supporting workforce diversity.</p>\n<p>At Mixpanel, we are focused on the things that really matter: our people, our customers, and our partners, out of a recognition that those relationships are the most valuable assets we have.</p>\n<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>\n<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, veteran status, or disability status.</p>\n<p>Pursuant to the San Francisco Fair Chance Ordinance or other similar laws that may be applicable, we will consider for employment qualified applicants with arrest and conviction records.</p>\n<p>We’ve immersed ourselves in our Culture and Values as our guiding principles for the impact we want to have and the future we are building.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b3cf0ff9-4c6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mixpanel","sameAs":"https://mixpanel.com","logo":"https://logos.yubhub.co/mixpanel.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mixpanel/jobs/7650541","x-work-arrangement":"hybrid","x-experience-level":null,"x-job-type":"full-time","x-salary-range":null,"x-skills-required":["customer facing SAAS support","technical concepts","problem-solving","programming experience","web & mobile technologies","APIs","debugging","collaboration","SLA based support","dedicated support","Hebrew","English"],"x-skills-preferred":["Mixpanel","analytics tools","databases","cloud data warehouses","product analytics implementation methods","SDKs","Customer Data Platforms","Event Streaming","Reverse ETL"],"datePosted":"2026-04-18T15:57:10.436Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Tel Aviv, Israel (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"customer facing SAAS support, technical concepts, problem-solving, programming experience, web & mobile technologies, APIs, debugging, collaboration, SLA based support, dedicated support, Hebrew, English, Mixpanel, analytics tools, databases, cloud data warehouses, product analytics implementation methods, SDKs, Customer Data Platforms, Event Streaming, Reverse ETL"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6b0c92b4-c05"},"title":"Sr. Manager, Field Engineering","description":"<p>We are looking for a dynamic Sr. Manager, Field Engineering to lead a team of Solution Architects within our Retail vertical. 
As a key member of our Field Engineering team, you will be responsible for helping customers in the Consumer Goods segment succeed with Databricks and providing outsized value to their businesses. You will also be responsible for maintaining a robust hiring pipeline, establishing relationships across the business, and partnering with sales leadership to hit sales and consumption targets.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Hiring, training, growing, and managing a team of Solutions Architects</li>\n<li>Making customers in the Consumer Goods segment successful with Databricks and providing outsized value to their businesses</li>\n<li>Maintaining a robust hiring pipeline at all times</li>\n<li>Establishing relationships across the business to make customers and the team successful</li>\n<li>Partnering with sales leadership to hit sales and consumption targets</li>\n</ul>\n<p>The ideal candidate will have 7+ years of professional experience in the data space with a technical product, 3+ years of experience in the field, architecting and delivering data-driven solutions for major accounts within the Retail, Consumer Products, Travel &amp; Hospitality vertical, and a deep familiarity with the buy-side and supply-side ecosystem.</p>\n<p>In addition, the candidate should have demonstrated expertise with the data collaboration ecosystem, 3+ years of experience building and leading technical pre-sales teams, a deep technical understanding of the impact that Data + AI can drive within the Retail industry, and experience serving as a trusted advisor to technical executives who guide strategic data infrastructure decisions.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6b0c92b4-c05","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8362888002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$192,100-$264,175 USD","x-skills-required":["data warehousing","big data","machine learning","data collaboration ecosystem","customer data platforms","clean rooms","data marketplaces"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:56:40.724Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data warehousing, big data, machine learning, data collaboration ecosystem, customer data platforms, clean rooms, data marketplaces","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":192100,"maxValue":264175,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2f962d3f-14e"},"title":"Resident Solutions Architect - Communications, Media, Entertainment & Games","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the 
regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-tos, and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to customers&#39; successful understanding, evaluation, and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data 
architectures</li>\n<li>Experience with technical project delivery, managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Databricks Certification</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in, visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2f962d3f-14e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8461218002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems","Apache Spark","CI/CD","MLOps","performant end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:56:09.899Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dallas, Texas"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_995b724b-c85"},"title":"Senior Sales Engineer, Partnerships","description":"<p>We are seeking a Senior Sales Engineer, Partnerships to join our team. As a Senior Sales Engineer, you will be responsible for providing technical expertise and strategic enablement to partners, facilitating strategies to pursue avenues of revenue outside of the life sciences. 
This role bridges technical knowledge and business strategy, supporting partners during discovery, qualification, and solution design to showcase the value of Komodo&#39;s healthcare data and analytics platform.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Serve as a technical lead on 8-10 strategic opportunities, directly influencing deal cycles and accelerating revenue growth.</li>\n<li>Become the definitive subject matter expert on Komodo&#39;s comprehensive suite of healthcare data assets and platform capabilities.</li>\n<li>Garner subject matter expertise and ownership of a segment within the Partnerships / Channel Partnerships organization.</li>\n<li>Develop scalable technical frameworks, demo environments, and reusable assets that set new organizational standards, with a heavy emphasis on agentic AI workflows.</li>\n<li>Drive cross-functional initiatives by partnering with Product, Data Science, and Engineering to deliver customized, innovative solutions.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>7+ years of experience in Sales Engineering or Solutions Engineering with a focus on healthcare data and healthcare technology.</li>\n<li>Proven track record of understanding and leveraging AI tools to enhance SaaS products or improve operational workflows.</li>\n<li>Expertise in healthcare data (e.g., 837/835 transactions, NDC codes) and its practical applications in analytics, reporting, and decision-making.</li>\n<li>Strong technical skills, including experience with APIs, data integration, cloud-based architectures (e.g., AWS, Azure), and analyzing large datasets.</li>\n<li>An understanding of and proficiency in data science techniques, specifically SQL, Python, and/or R.</li>\n<li>Excellent communication and presentation skills, with the ability to train partners and translate complex technical concepts for diverse stakeholders.</li>\n</ul>\n<p>Preferred Skills:</p>\n<ul>\n<li>Experience working within the provider, payer, or financial service 
segments.</li>\n<li>Technical certifications in AWS, Azure, or data platforms.</li>\n<li>Experience with CRM platforms like Salesforce for managing partner and client interactions.</li>\n<li>Familiarity with data visualization tools (e.g., Tableau, Looker) to create impactful partner training materials.</li>\n<li>Knowledge of identity resolution and privacy-preserving linking technologies.</li>\n<li>Prior experience developing joint business plans and co-sell strategies with channel partners.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_995b724b-c85","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Komodo Health","sameAs":"https://www.komodohealth.com/","logo":"https://logos.yubhub.co/komodohealth.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/komodohealth/jobs/8495825002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$143,000-$193,000 USD","x-skills-required":["Sales Engineering","Healthcare Data","Healthcare Technology","AI Tools","APIs","Data Integration","Cloud-Based Architectures","Data Science Techniques","SQL","Python","R","Excellent Communication","Presentation Skills"],"x-skills-preferred":["Experience Working Within Provider, Payer, or Financial Service Segments","Technical Certifications in AWS, Azure, or Data Platforms","Experience with CRM Platforms Like Salesforce","Familiarity with Data Visualization Tools","Knowledge of Identity Resolution and Privacy-Preserving Linking Technologies","Prior Experience Developing Joint Business Plans and Co-Sell Strategies"],"datePosted":"2026-04-18T15:56:08.142Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Healthcare","skills":"Sales 
Engineering, Healthcare Data, Healthcare Technology, AI Tools, APIs, Data Integration, Cloud-Based Architectures, Data Science Techniques, SQL, Python, R, Excellent Communication, Presentation Skills, Experience Working Within Provider, Payer, or Financial Service Segments, Technical Certifications in AWS, Azure, or Data Platforms, Experience with CRM Platforms Like Salesforce, Familiarity with Data Visualization Tools, Knowledge of Identity Resolution and Privacy-Preserving Linking Technologies, Prior Experience Developing Joint Business Plans and Co-Sell Strategies","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":143000,"maxValue":193000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_64cc7147-8b9"},"title":"Sr. Manager, Technical Solutions","description":"<p>We are looking for a Senior Technical Solutions Manager to grow, lead and manage the Technical solutions engineers and support teams in India. 
The Senior Technical Solutions Manager is responsible for building and managing a regional team of technical experts focused on resolving highly complex and long-running support tickets raised by Databricks customers while overseeing Support operations.</p>\n<p>Impact you will have:</p>\n<ul>\n<li>Build and manage a team of Technical Solution Engineers</li>\n<li>Provide coaching and mentorship to the engineers</li>\n<li>Identify and implement process improvements to meet or exceed regional performance KPIs.</li>\n<li>Establish training plans and subject matter expertise within the team.</li>\n<li>Drive support escalations and establish cross-functional collaboration to manage and resolve issues.</li>\n<li>Be a player-coach and provide technical leadership to the regional support team.</li>\n<li>Coordinate with Sales and field teams to address account-level concerns and drive adoption and usage of the Databricks platform.</li>\n<li>Define quarterly goals and track them to completion to drive team growth and personal development.</li>\n<li>Scale the organisation by developing processes and guidelines that promote operational efficiency</li>\n<li>Demonstrate a true sense of ownership and coordinate action items with engineering and escalation teams to achieve timely resolution of customer issues.</li>\n<li>Perform risk assessments and be a hands-on leader</li>\n</ul>\n<p>What we are looking for:</p>\n<ul>\n<li>15+ years of experience in the Tech Industry</li>\n<li>SaaS Support, building, testing, and maintaining</li>\n<li>6+ years of managerial experience leading a team of at least 6 technical support engineers</li>\n<li>Proven experience working with cloud native applications/SaaS (AWS, Azure, GCP), big data platforms, or Apache Spark™ in a technical capacity</li>\n<li>Demonstrated experience in a customer-facing role managing a large regional team of technical support engineers.</li>\n<li>Excellent analytical and troubleshooting 
skills.</li>\n<li>Excellent customer facing, verbal and written communication skills</li>\n<li>A team-oriented attitude and a high degree of comfort working in a startup environment</li>\n<li>Hands-on experience in systems troubleshooting, networking, and Linux fundamentals, JVM troubleshooting, debugging of Java applications is preferred</li>\n</ul>\n<p>Benefits</p>\n<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region click here.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_64cc7147-8b9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8341135002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SaaS Support","Cloud Native Applications","Big Data Platforms","Apache Spark","Java","Linux","Networking"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:55:50.362Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SaaS Support, Cloud Native Applications, Big Data Platforms, Apache Spark, Java, Linux, Networking"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0036f074-845"},"title":"Resident Solutions Architect - Financial Services","description":"<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on 
their big data challenges using the Databricks platform.</p>\n<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-tos, and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap hands-on projects which lead to customers&#39; successful understanding, evaluation, and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed 
computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Capable of design and deployment of highly performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>\n<li>Travel to customers up to 20% of the time</li>\n</ul>\n<p>Nice to have: Databricks Certification</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0036f074-845","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8456966002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","design and deployment of highly performant end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:55:41.870Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Boston, Massachusetts"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production 
deployments, MLOps, design and deployment of highly performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3b6af70e-6ba"},"title":"Field Engineering, Data Warehousing Product Specialist","description":"<p>As the Data Warehousing Product Specialist in the Field Engineering team, you will be defining and driving our technical go-to-market strategy in the Asia Pacific and Japan region.</p>\n<p>You will be directly working with customers to guide and influence their data warehousing architecture and decisions. You will be acting as a trusted advisor for senior executives and your in-depth technical knowledge will ensure our customers are successful in leveraging Databricks to solve their business problems.</p>\n<p>You will be the technical expert to support our field engineering teams internally and you will be expected to help enable the team to understand the key differentiators of our product against our competitors.</p>\n<p>You will partner with the Product Manager(s) to help to define the product direction based on local knowledge and inform our product strategy with our go-to-market field teams.</p>\n<p>You will not have any direct reports but will recruit and lead a group of specialists across the field dedicated to scale your impact.</p>\n<p>You will also be a thought leader externally to the market via speaking at conferences, online webinars, and blog posts.</p>\n<p>You will meet with customers to communicate the vision and gather feedback.</p>\n<p>You have expertise in cloud-based data warehousing technologies, modern data platform architectures, traditional data warehousing techniques, and preferably industry domain 
knowledge.</p>\n<p>You will excel in creating and articulating a compelling value proposition for our customers and enabling Account Executives and Field Engineers to operate effectively using best practices and assets that you own and develop.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Lead the vision and strategy for Data Warehousing Adoption in Asia Pacific and Japan</li>\n<li>Develop materials for our GTM team to effectively communicate our value proposition</li>\n<li>Support our field teams in key strategic pursuit opportunities</li>\n<li>Lead a team of subject matter experts (SMEs) to scale your impact in local regions and enable the broader field team to have confidence in competing in the market</li>\n<li>Act as the thought leader to build confidence and relationships with our customers at the executive level</li>\n<li>Present externally at conferences, events, and webinars and publish content to establish our position in the market</li>\n<li>Work with post sales and partners to establish migration best practices</li>\n<li>Act as a cross-functional representative with Product Management, Product Marketing, and Engineering on our go-to-market motion</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3b6af70e-6ba","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8151567002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data warehousing","cloud-based data warehousing technologies","modern data platform architectures","traditional data warehousing techniques","industry domain 
knowledge"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:55:35.031Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Singapore"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data warehousing, cloud-based data warehousing technologies, modern data platform architectures, traditional data warehousing techniques, industry domain knowledge"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c6a4bf88-cba"},"title":"Engineering Manager, Spark Connect","description":"<p>We are building the world&#39;s best data and AI infrastructure platform so our customers can solve the world&#39;s toughest problems. The Spark Platform organisation builds the core technologies that power Databricks and the Apache Spark ecosystem. We are looking for an Engineering Manager to lead the Spark Connect team.</p>\n<p>This leader will own the Databricks Spark Connect platform, drive reliability and execution for a critical Serverless component, and lead our open source Spark Connect strategy in Apache Spark. 
This includes owning the OSS roadmap, partnering across teams, and driving adoption of Spark Connect in the open source ecosystem.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Lead the team responsible for Spark Connect, a critical component of Databricks Serverless.</li>\n<li>Own platform reliability, scalability, and operational excellence.</li>\n<li>Drive Databricks&#39; technical leadership in open source Spark Connect.</li>\n<li>Define and execute the strategy for OSS Spark Connect adoption and ecosystem growth.</li>\n<li>Partner across product, runtime, serverless, and OSS stakeholders to shape roadmap and architecture.</li>\n<li>Hire and grow a strong engineering team with high technical and operational standards.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>10+ years of experience in distributed systems, infrastructure, or data platforms.</li>\n<li>3+ years of experience managing high-performing engineering teams.</li>\n<li>Strong technical depth and a track record of owning critical production systems.</li>\n<li>Experience leading cross-functional initiatives and influencing technical direction.</li>\n<li>Passion for open source, developer platforms, and large-scale systems.</li>\n</ul>\n<p>Pay Range Transparency Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. 
For more information regarding which range your location is in visit our page here.</p>\n<p>Zone 2 Pay Range $180,500-$248,150 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_c6a4bf88-cba","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com/","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8502969002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,500-$248,150 USD","x-skills-required":["distributed systems","infrastructure","data platforms","engineering management","open source","developer platforms","large-scale systems"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:55:33.462Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bellevue, Washington"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"distributed systems, infrastructure, data platforms, engineering management, open source, developer platforms, large-scale systems","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180500,"maxValue":248150,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e68e5c3b-1e2"},"title":"Lakebase Account Executive","description":"<p>We are seeking a Lakebase Account Executive to help customers modernize their operational data foundation with Databricks Lakebase, our fully-managed Postgres offering for intelligent applications.</p>\n<p>As a Lakebase Account Executive, you will drive new Lakebase revenue by identifying, qualifying, and closing Lakebase opportunities within a defined territory, in partnership with regional 
Account Executives and the broader account team.</p>\n<p>You will lead with outcomes for key Lakebase personas, including platform teams and developers, data teams, and central IT, articulating how Lakebase helps them ship features faster, simplify operational data architectures, and improve governance and cost efficiency.</p>\n<p>You will sell the value of fully-managed Postgres for intelligent applications, positioning Lakebase as the optimal choice for operational workloads that power real-time, AI-driven experiences.</p>\n<p>You will run complex, multi-threaded sales cycles from discovery and value hypothesis through commercial negotiation and close, navigating executive, technical, and line-of-business stakeholders.</p>\n<p>You will orchestrate proof-of-value and POCs that validate Lakebase’s benefits for OLTP-style workloads, reverse ETL, and AI/ML-driven applications, in partnership with solution architects and specialists.</p>\n<p>You will compete and win against legacy and cloud-native operational databases by leveraging our compete assets, benchmarks, and customer references.</p>\n<p>You will align to measurable business outcomes such as performance, developer productivity, time-to-market for new features, cost reduction, and simplification of the operational data landscape.</p>\n<p>You will partner cross-functionally with Product Management, Marketing, Customer Success, and Partner teams to shape territory plans, launch plays, and co-selling motions with key ISVs and GSIs.</p>\n<p>You will enable the field by sharing Lakebase best practices, success stories, and sales motions with broader sales teams, helping scale Lakebase proficiency across the organization.</p>\n<p>This role requires the ability to operate across two key motions simultaneously:</p>\n<p>Establish top strategic focus accounts by engaging application development teams to create net-new intelligent applications leveraging Lakebase.</p>\n<p>Drive longer-term Postgres standardization and 
migration within Databricks&#39; most strategic accounts.</p>\n<p>Candidates should demonstrate how they can act as a force multiplier across multiple dimensions of the business.</p>\n<p>Success in this role requires strength in four areas:</p>\n<p>Business ownership – Operate at a business-unit level by tracking revenue, pipeline, and key observations, and by identifying areas needing additional focus or support.</p>\n<p>Strategic account engagement – Partner with account teams to engage priority accounts across the global DB700, driving strategic opportunities from initial engagement through successful outcomes.</p>\n<p>Field enablement – Build and execute enablement plans that empower AEs and SAs to confidently carry the Lakebase conversation even when the specialist is not present.</p>\n<p>Market voice and thought leadership – Develop an internal and external presence by contributing to global AMAs and internal forums, and by representing Databricks at key first- and third-party events.</p>\n<p>The interview process is designed to evaluate candidates across all four of these dimensions.</p>\n<p>We are looking for a candidate with 7+ years of enterprise SaaS sales experience, consistently exceeding quota in complex, multi-stakeholder deals.</p>\n<p>Proven success selling data platforms, operational databases (e.g., Postgres, MySQL, cloud-native DBaaS), or adjacent data/AI infrastructure to technical buyers and business leaders.</p>\n<p>Strong understanding of modern data and application architectures, including cloud-native services, microservices, event-driven systems, and how operational data underpins AI and analytics strategies.</p>\n<p>Ability to sell to both technical stakeholders (developers, architects, data engineers) and business stakeholders (product leaders, operations, line-of-business owners).</p>\n<p>Demonstrated experience leading specialist or overlay motions, working jointly with core Account Executives to create and progress 
opportunities.</p>\n<p>Executive presence with the ability to whiteboard architectures, lead C-level conversations, and build trust with senior decision makers.</p>\n<p>Strong value selling skills: adept at discovering pain, building a business case, and tying technical capabilities to clear, quantified outcomes.</p>\n<p>Excellent communication, storytelling, and negotiation skills, with comfort presenting to both large and small audiences.</p>\n<p>Bachelor’s degree or equivalent practical experience.</p>\n<p>Preferred qualifications include experience selling Postgres, operational databases, OLTP workloads, or transactional cloud database services, ideally within large or strategic accounts.</p>\n<p>Familiarity with data platforms, lakehouse architectures, and cloud ecosystems (AWS, Azure, GCP), including how operational databases fit within broader data and AI strategies.</p>\n<p>Understanding of reverse ETL, real-time decisioning, and operational analytics use cases, and how they drive value for customer-facing and internal applications.</p>\n<p>Exposure to AI-native and agent-driven applications that depend on low-latency, highly scalable operational data services.</p>\n<p>Prior experience in a high-growth, category-creating environment, helping shape new plays, messaging, and customer narratives.</p>\n<p>Experience collaborating with partners and ISVs to drive joint pipeline and co-sell motions.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e68e5c3b-1e2","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8449848002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Postgres","operational databases","OLTP workloads","transactional cloud database services","data platforms","lakehouse architectures","cloud ecosystems","reverse ETL","real-time decisioning","operational analytics","AI-native applications","agent-driven applications","low-latency","highly scalable operational data services"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:55:06.106Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Singapore"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Postgres, operational databases, OLTP workloads, transactional cloud database services, data platforms, lakehouse architectures, cloud ecosystems, reverse ETL, real-time decisioning, operational analytics, AI-native applications, agent-driven applications, low-latency, highly scalable operational data services"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c6d97a39-7f0"},"title":"Senior Engineering Manager, Activation","description":"<p>Job Title: Senior Engineering Manager, Activation</p>\n<p>Join us at Brex, the intelligent finance platform that enables companies to spend smarter and move faster. 
We&#39;re looking for a seasoned engineering leader to lead our engineering group focused on building the systems and product experiences that power customer activation at Brex.</p>\n<p>As a Senior Engineering Manager, you will be responsible for driving business and product strategies, collaborating with cross-functional partners, leveraging AI to reimagine and automate onboarding and implementation workflows, and leading and managing multiple teams of engineers.</p>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s or Master&#39;s degree in Computer Science, Engineering, or a related field</li>\n<li>Strong technical background and understanding of software development principles</li>\n<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences</li>\n<li>Demonstrated track record of shipping customer-facing features across multiple release cycles</li>\n<li>3+ years of experience managing or leading multiple technical teams in a high-growth environment</li>\n</ul>\n<p>Bonus points:</p>\n<ul>\n<li>Experience with data platforms such as Snowflake, Hex, or similar</li>\n<li>You have started your own technology venture or were an early technical founder/employee</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $300,000 - $375,000. 
However, the starting base pay will depend on a number of factors including the candidate&#39;s location, skills, experience, market demands, and internal pay parity.</p>\n<p>If you&#39;re a champion for the customer and constantly put yourself in their shoes to create intuitive, frictionless experiences, we&#39;d love to hear from you!</p>","url":"https://yubhub.co/jobs/job_c6d97a39-7f0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8330487002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$300,000 - $375,000","x-skills-required":["Leadership","Software Development","AI","Data Platforms","Full-Stack Engineering"],"x-skills-preferred":["Snowflake","Hex"],"datePosted":"2026-04-18T15:54:31.577Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Leadership, Software Development, AI, Data Platforms, Full-Stack Engineering, Snowflake, Hex","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":300000,"maxValue":375000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3a17bc01-d7d"},"title":"Staff Software Engineer","description":"<p>DBT Labs is seeking a Staff Software Engineer to join our Engineering team. As a seasoned engineer, you will architect and build the durable memory substrate that powers agentic analytics workflows. 
This platform stores not just metadata, but meaning: decisions, intent, rationale, and history, and makes it safely accessible to humans, agents, and applications.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Prototyping apt technical solutions and finding best fits for the context engine.</li>\n<li>Architecting and building the core Context Platform.</li>\n<li>Designing schemas and primitives for Decision Memory and enterprise context.</li>\n<li>Owning context storage systems (graph, vector, event/time-based).</li>\n<li>Building read/write/query APIs used by agents, products, and external apps.</li>\n<li>Designing permission-aware, auditable context access.</li>\n</ul>\n<p>You will be working closely with agentic systems engineers and product leadership to ensure the context engine is interoperable, portable, and zero-lock-in by design.</p>\n<p>In this role, you will own:</p>\n<ul>\n<li>Context schemas and schema evolution strategies.</li>\n<li>Storage and data modeling choices.</li>\n<li>Platform APIs and interfaces.</li>\n<li>Security, identity propagation, and audit foundations.</li>\n<li>Long-term scalability and correctness of context data.</li>\n</ul>\n<p>You will not own:</p>\n<ul>\n<li>Agent behavior or orchestration logic.</li>\n<li>Business rules or governance policy decisions.</li>\n<li>Product UI or workflow automation.</li>\n</ul>\n<p>The ideal candidate will have significant experience building distributed systems, data platforms, or infrastructure, and will be comfortable operating in ambiguous, greenfield problem spaces. 
They will also have deep expertise in data modeling and schema design, experience designing shared platforms used by many teams, and strong instincts around APIs, contracts, and backward compatibility.</p>\n<p>Nice to have: experience with knowledge graphs, metadata systems, or search/retrieval systems; experience building systems with governance, auditability, or compliance requirements; and familiarity with dbt, modern analytics stacks, or developer tooling.</p>","url":"https://yubhub.co/jobs/job_3a17bc01-d7d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4661362005","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Distributed systems","Data platforms","Infrastructure","Data modeling","Schema design","APIs","Contracts","Backward compatibility","Knowledge graphs","Metadata systems","Search/retrieval systems"],"x-skills-preferred":["dbt","Modern analytics stacks","Developer tooling"],"datePosted":"2026-04-18T15:54:01.444Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"India - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Distributed systems, Data platforms, Infrastructure, Data modeling, Schema design, APIs, Contracts, Backward compatibility, Knowledge graphs, Metadata systems, Search/retrieval systems, dbt, Modern analytics stacks, Developer tooling"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_58a44dab-91a"},"title":"Partner Solutions Architect - 
Japan","description":"<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across Japan. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>You will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud.</p>\n<p>Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing. This is not a purely reactive enablement role. The Partner SA is expected to help shape and execute repeatable partner plays that create revenue.</p>\n<p>That includes enabling partner sellers and architects, supporting account mapping and seller-to-seller engagement, helping define joint value propositions, supporting partner-led pipeline generation, and influencing product and field strategy based on what is learned in-market.</p>\n<p>Internal operating docs show this motion consistently includes enablement sessions, QBR sponsorships, account planning, workshops, field events, and targeted campaigns designed to produce sourced and influenced pipeline.</p>\n<p>You&#39;ll be part of a team helping dbt scale its ecosystem through better partner capability, tighter field alignment, and more repeatable pipeline generation. 
The role is especially important as dbt continues investing in structured partner motions and deeper engagement with major cloud and data platform partners.</p>\n<p>What you&#39;ll do:</p>\n<ul>\n<li>Partner closely with Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n</ul>\n<ul>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n</ul>\n<ul>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n</ul>\n<ul>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n</ul>\n<ul>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n</ul>\n<ul>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>\n</ul>\n<ul>\n<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n</ul>\n<ul>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n</ul>\n<ul>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n</ul>\n<ul>\n<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n</ul>\n<ul>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA 
function</li>\n</ul>\n<ul>\n<li>Travel approximately 30-40% to support partner planning, enablement, executive meetings, and field events across Japan</li>\n</ul>\n<p>This scope reflects how the Partner SA team is already operating: enabling partner field teams, building account-level alignment, supporting QBRs and regional events, and translating those activities into sourced and engaged pipeline.</p>\n<p>What you&#39;ll need:</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>\n</ul>\n<ul>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n</ul>\n<ul>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n</ul>\n<ul>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n</ul>\n<ul>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n</ul>\n<ul>\n<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n</ul>\n<ul>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n</ul>\n<ul>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>\n</ul>\n<ul>\n<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>\n</ul>\n<ul>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out:</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n</ul>\n<ul>\n<li>Experience with 
Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n</ul>\n<ul>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n</ul>\n<ul>\n<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>\n</ul>\n<ul>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n</ul>\n<ul>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n</ul>\n<ul>\n<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>\n</ul>\n<ul>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>What to expect in the interview process (all video interviews unless accommodations are needed):</p>\n<ul>\n<li>Interview with Talent Acquisition Partner</li>\n</ul>\n<ul>\n<li>Interview with Hiring Manager</li>\n</ul>\n<ul>\n<li>Team Interviews</li>\n</ul>\n<ul>\n<li>Demo Round</li>\n</ul>\n<p>#LI-LA1</p>","url":"https://yubhub.co/jobs/job_58a44dab-91a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673657005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data platforms","Snowflake","Databricks","Google Cloud","partner engineering","customer-facing technical 
role"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:53:29.744Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Japan - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner engineering, customer-facing technical role, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c2aaf7ac-804"},"title":"Security Engineer - Threat Detection","description":"<p><strong>Job Description</strong></p>\n<p>You will design, build, and maintain detections that identify malicious activity across Stripe&#39;s infrastructure, applications, and cloud environments.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design, build, and tune high-fidelity detections across modern SIEM platforms, covering adversary TTPs across the full attack lifecycle</li>\n<li>Develop detection hypotheses by researching TTPs, identifying evidence sources, and determining detection opportunities across available telemetry</li>\n<li>Conduct hypothesis-driven threat hunts to identify malicious activity, uncover detection gaps, and validate security controls</li>\n<li>Perform malware analysis and reverse engineering to extract indicators and inform detection strategies</li>\n<li>Build network-based detections (flow, pcap, protocol analysis) and endpoint-based detections (event logs, EDR telemetry, memory/file artifacts) across Windows, Linux, and 
macOS</li>\n<li>Partner with Threat Intelligence to operationalize intel reports into detections, hunting leads, and enrichment logic</li>\n<li>Collaborate with IR, SOC, and offensive security teams to validate and refine detections based on real-world incidents and red team exercises</li>\n<li>Build data pipelines, automation, and tooling that enable detection-as-code practices and scalable deployment</li>\n<li>Map detection coverage to MITRE ATT&amp;CK, identifying and prioritizing gaps across key attack surfaces</li>\n<li>Lead projects, mentor teammates, and champion quality standards within the team</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>5+ years of experience in detection engineering, threat hunting, or security operations</li>\n<li>Demonstrated experience writing detection logic in modern SIEM platforms (e.g., Splunk, Chronicle, Elastic, CrowdStrike NG-SIEM, Panther, Microsoft Sentinel)</li>\n<li>Strong understanding of adversary tradecraft across the attack lifecycle: initial access, privilege escalation, lateral movement, defense evasion, persistence, and exfiltration</li>\n<li>Ability to extract TTPs from threat intelligence reports and translate them into detection opportunities</li>\n<li>Experience developing network-based and endpoint-based detections across multiple OS platforms (Windows, Linux, macOS)</li>\n<li>Experience analyzing telemetry across endpoint, network, cloud (AWS/GCP/Azure), identity, and application log sources</li>\n<li>Proficiency in detection/query languages (SPL, KQL, EQL, YARA-L, SQL) and programming (Python or similar)</li>\n<li>Strong communication skills with the ability to document detection logic and explain findings to technical and non-technical audiences</li>\n<li>Adversarial mindset: understanding how attackers operate to build detections that catch real-world threats</li>\n</ul>\n<p><strong>Preferred Qualifications</strong></p>\n<ul>\n<li>Experience in detection engineering or threat hunting within 
fintech, financial services, or highly regulated environments</li>\n<li>Background in malware analysis, reverse engineering, or threat research</li>\n<li>Experience with purple team operations, collaborating with offensive security to validate detections</li>\n<li>Familiarity with big data platforms (Databricks, Trino, PySpark) for large-scale log analysis</li>\n<li>Proficiency with AI/LLM-assisted development tools (Claude Code, Cursor, GitHub Copilot) applied to detection workflows</li>\n<li>Interest in agentic automation, using LLMs to augment hunting, tuning, or triage</li>\n<li>Experience with detection validation tools (Atomic Red Team, ATT&amp;CK Evaluations)</li>\n<li>Contributions to open-source detection content, research, or conference presentations</li>\n<li>Relevant certifications such as HTB CDSA, GCIH, GCFA, GNFA, OSCP, TCM PMAT, or GREM</li>\n</ul>","url":"https://yubhub.co/jobs/job_c2aaf7ac-804","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Stripe","sameAs":"https://stripe.com/","logo":"https://logos.yubhub.co/stripe.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/stripe/jobs/7827230","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["detection engineering","threat hunting","security operations","SIEM platforms","adversary tradecraft","network-based detections","endpoint-based detections","telemetry analysis","detection/query languages","programming","communication skills"],"x-skills-preferred":["fintech","financial services","malware analysis","reverse engineering","purple team operations","big data platforms","AI/LLM-assisted development tools","agentic automation","detection validation tools","open-source detection content","relevant 
certifications"],"datePosted":"2026-04-18T15:53:27.161Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Ireland"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"detection engineering, threat hunting, security operations, SIEM platforms, adversary tradecraft, network-based detections, endpoint-based detections, telemetry analysis, detection/query languages, programming, communication skills, fintech, financial services, malware analysis, reverse engineering, purple team operations, big data platforms, AI/LLM-assisted development tools, agentic automation, detection validation tools, open-source detection content, relevant certifications"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4ea7999b-3d8"},"title":"Resident Solutions Architect - Healthcare & Life Sciences","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement 
transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing using Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Databricks Certification</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. 
The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in, visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>","url":"https://yubhub.co/jobs/job_4ea7999b-3d8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8494145002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems","Apache Spark","CI/CD","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:53:02.737Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Austin, 
Texas"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d6421dea-6e3"},"title":"Strategic Hunter Account Executive - Lakebase","description":"<p>We are seeking a Strategic Hunter Account Executive to help customers modernize their operational data foundation with Databricks Lakebase, our fully-managed Postgres offering for intelligent applications.</p>\n<p>This high-impact role sits within the Lakebase Go-To-Market team and partners closely with regional Account Executives to drive adoption of Lakebase with platform, application, and data teams.</p>\n<p>Lakebase gives customers a unified, governed foundation for operational workloads and AI-native applications, helping them move away from a fragmented estate of point databases toward a modern, scalable, serverless Postgres service.</p>\n<p>If you want to be at the forefront of operational databases for AI and intelligent applications at one of the fastest-growing data and AI companies in the world, this is your opportunity.</p>\n<p><strong>The impact you will have</strong></p>\n<ul>\n<li>Drive new Lakebase revenue by identifying, qualifying, and closing Lakebase opportunities within a defined territory, in partnership with regional Account Executives and the broader account team.</li>\n</ul>\n<ul>\n<li>Lead with outcomes for key Lakebase personas, including platform teams and developers, data teams, and central IT, articulating how Lakebase 
helps them ship features faster, simplify operational data architectures, and improve governance and cost efficiency.</li>\n</ul>\n<ul>\n<li>Sell the value of fully-managed Postgres for intelligent applications, positioning Lakebase as the optimal choice for operational workloads that power real-time, AI-driven experiences.</li>\n</ul>\n<ul>\n<li>Run complex, multi-threaded sales cycles from discovery and value hypothesis through commercial negotiation and close, navigating executive, technical, and line-of-business stakeholders.</li>\n</ul>\n<ul>\n<li>Orchestrate proof-of-value and POCs that validate Lakebase’s benefits for OLTP-style workloads, reverse ETL, and AI/ML-driven applications, in partnership with solution architects and specialists.</li>\n</ul>\n<ul>\n<li>Compete and win against legacy and cloud-native operational databases by leveraging our compete assets, benchmarks, and customer references.</li>\n</ul>\n<ul>\n<li>Align to measurable business outcomes such as performance, developer productivity, time-to-market for new features, cost reduction, and simplification of the operational data landscape.</li>\n</ul>\n<ul>\n<li>Partner cross-functionally with Product Management, Marketing, Customer Success, and Partner teams to shape territory plans, launch plays, and co-selling motions with key ISVs and GSIs.</li>\n</ul>\n<ul>\n<li>Enable the field by sharing Lakebase best practices, success stories, and sales motions with broader sales teams, helping scale Lakebase proficiency across the organization.</li>\n</ul>\n<p><strong>What success looks like in this role</strong></p>\n<p>This role requires the ability to operate across two key motions simultaneously:</p>\n<ul>\n<li>Establish top strategic focus accounts by engaging application development teams to create net-new intelligent applications leveraging Lakebase.</li>\n</ul>\n<ul>\n<li>Drive longer-term Postgres standardization and migration within Databricks&#39; most strategic 
accounts.</li>\n</ul>\n<p>Candidates should demonstrate how they can act as a force multiplier across multiple dimensions of the business.</p>\n<p>Success in this role requires strength in four areas:</p>\n<ul>\n<li>Business ownership – Operate at a business-unit level by tracking revenue, pipeline, and key observations, and by identifying areas needing additional focus or support.</li>\n</ul>\n<ul>\n<li>Strategic account engagement – Partner with account teams to engage priority accounts across the global DB700, driving strategic opportunities from initial engagement through successful outcomes.</li>\n</ul>\n<ul>\n<li>Field enablement – Build and execute enablement plans that empower AEs and SAs to confidently carry the Lakebase conversation even when the specialist is not present.</li>\n</ul>\n<ul>\n<li>Market voice and thought leadership – Develop an internal and external presence by contributing to global AMAs and internal forums, and by representing Databricks at key first- and third-party events.</li>\n</ul>\n<p><strong>What we look for</strong></p>\n<ul>\n<li>7+ years of enterprise SaaS sales experience, consistently exceeding quota in complex, multi-stakeholder deals.</li>\n</ul>\n<ul>\n<li>Proven success selling data platforms, operational databases (e.g., Postgres, MySQL, cloud-native DBaaS), or adjacent data/AI infrastructure to technical buyers and business leaders.</li>\n</ul>\n<ul>\n<li>Strong understanding of modern data and application architectures, including cloud-native services, microservices, event-driven systems, and how operational data underpins AI and analytics strategies.</li>\n</ul>\n<ul>\n<li>Ability to sell to both technical stakeholders (developers, architects, data engineers) and business stakeholders (product leaders, operations, line-of-business owners).</li>\n</ul>\n<ul>\n<li>Demonstrated experience leading specialist or overlay motions, working jointly with core Account Executives to create and progress 
opportunities.</li>\n</ul>\n<ul>\n<li>Executive presence with the ability to whiteboard architectures, lead C-level conversations, and build trust with senior decision makers.</li>\n</ul>\n<ul>\n<li>Strong value selling skills: adept at discovering pain, building a business case, and tying technical capabilities to clear, quantified outcomes.</li>\n</ul>\n<ul>\n<li>Excellent communication, storytelling, and negotiation skills, with comfort presenting to both large and small audiences.</li>\n</ul>\n<ul>\n<li>Bachelor’s degree or equivalent practical experience.</li>\n</ul>\n<p><strong>Preferred qualifications</strong></p>\n<ul>\n<li>Experience selling Postgres, operational databases, OLTP workloads, or transactional cloud database services, ideally within large or strategic accounts.</li>\n</ul>\n<ul>\n<li>Familiarity with data platforms, lakehouse architectures, and cloud ecosystems (AWS, Azure, GCP), including how operational databases fit within broader data and AI strategies.</li>\n</ul>\n<ul>\n<li>Understanding of reverse ETL, real-time decisioning, and operational analytics use cases, and how they drive value for customer-facing and internal applications.</li>\n</ul>\n<ul>\n<li>Exposure to AI-native and agent-driven applications that depend on low-latency, highly scalable operational data services.</li>\n</ul>\n<ul>\n<li>Prior experience in a high-growth, category-creating environment, helping shape new plays, messaging, and customer narratives.</li>\n</ul>\n<ul>\n<li>Experience collaborating with partners and ISVs to drive joint pipeline and co-sell motions.</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. 
For specific details on the benefits offered in your region, please click here.</p>\n<p><strong>Our Commitment to Diversity and Inclusion</strong></p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d6421dea-6e3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8477547002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data platforms","operational databases","Postgres","MySQL","cloud-native DBaaS","data/AI infrastructure","technical buyers","business leaders","modern data and application architectures","cloud-native services","microservices","event-driven systems","AI and analytics strategies","technical stakeholders","business stakeholders","value selling skills","discovering pain","building a business case","quantified outcomes","communication","storytelling","negotiation skills"],"x-skills-preferred":["OLTP workloads","transactional cloud database services","lakehouse architectures","cloud ecosystems","reverse ETL","real-time decisioning","operational analytics use cases","AI-native applications","agent-driven applications","high-growth environments","category-creating environments","partner collaborations","ISV collaborations"],"datePosted":"2026-04-18T15:52:47.849Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru, India; Mumbai, India"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data platforms, operational databases, Postgres, MySQL, cloud-native DBaaS, data/AI infrastructure, technical buyers, business leaders, modern data and application 
architectures, cloud-native services, microservices, event-driven systems, AI and analytics strategies, technical stakeholders, business stakeholders, value selling skills, discovering pain, building a business case, quantified outcomes, communication, storytelling, negotiation skills, OLTP workloads, transactional cloud database services, lakehouse architectures, cloud ecosystems, reverse ETL, real-time decisioning, operational analytics use cases, AI-native applications, agent-driven applications, high-growth environments, category-creating environments, partner collaborations, ISV collaborations"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_47807ca3-e36"},"title":"Strategic AI/BI Account Executive","description":"<p>We are seeking a Strategic AI/BI Account Executive to help enterprise customers transform how business users interact with data. This high-impact role sits within the AI Go-To-Market team and partners closely with Enterprise Account Executives to drive adoption of Databricks AI/BI and Genie in APJ.</p>\n<p>You will help organisations move beyond static dashboards to governed, conversational, AI-powered analytics at the centre of the convergence of business intelligence, data platforms, and generative AI. Enterprise analytics is rapidly evolving from dashboards and static reporting to conversational, AI-driven decision platforms. 
Databricks AI/BI and Genie empower business users to securely interact with governed data using natural language, transforming the data platform into a true decision platform.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Partner with Enterprise AEs to identify, qualify, and close AI/BI opportunities</li>\n<li>Engage C-level, analytics, and line-of-business leaders to modernise analytics strategies</li>\n<li>Displace or expand legacy BI platforms with AI-powered, governed analytics solutions</li>\n<li>Lead conversations around semantic governance, self-service analytics, and natural language data access</li>\n<li>Drive proof-of-value engagements and scale enterprise-wide adoption</li>\n<li>Align AI/BI initiatives to measurable business outcomes (productivity, speed to insight, revenue impact)</li>\n<li>Enable field teams and serve as a subject matter expert on modern analytics architectures</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>Enterprise sales experience in BI, analytics, data platforms, or AI/ML</li>\n<li>Strong understanding of modern analytics architectures and data governance</li>\n<li>Ability to sell to both technical and business stakeholders</li>\n<li>Executive presence and experience navigating complex buying cycles</li>\n<li>Passion for AI and the impact of GenAI on enterprise analytics</li>\n<li>Experience operating in a specialist or overlay sales model</li>\n<li>Ability to translate technical capabilities into clear business value</li>\n<li>7+ years of Enterprise Sales experience, exceeding quotas in larger accounts</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot</li>\n<li>Familiarity with semantic layers, metrics stores, or governed data models</li>\n<li>Understanding of lakehouse architectures and cloud data platforms</li>\n<li>Exposure to GenAI, natural language interfaces, or conversational applications</li>\n<li>Consulting 
or solution design experience in customer-facing roles</li>\n</ul>","url":"https://yubhub.co/jobs/job_47807ca3-e36","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8441884002","x-work-arrangement":"onsite","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Enterprise sales experience in BI, analytics, data platforms, or AI/ML","Strong understanding of modern analytics architectures and data governance","Ability to sell to both technical and business stakeholders","Executive presence and experience navigating complex buying cycles","Passion for AI and the impact of GenAI on enterprise analytics"],"x-skills-preferred":["Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot","Familiarity with semantic layers, metrics stores, or governed data models","Understanding of lakehouse architectures and cloud data platforms","Exposure to GenAI, natural language interfaces, or conversational applications","Consulting or solution design experience in customer-facing roles"],"datePosted":"2026-04-18T15:52:23.856Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Singapore"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Enterprise sales experience in BI, analytics, data platforms, or AI/ML, Strong understanding of modern analytics architectures and data governance, Ability to sell to both technical and business stakeholders, Executive presence and experience navigating complex buying cycles, Passion for AI and the impact of GenAI on enterprise analytics, Experience with modern BI platforms such as 
Tableau, Power BI, Looker, or ThoughtSpot, Familiarity with semantic layers, metrics stores, or governed data models, Understanding of lakehouse architectures and cloud data platforms, Exposure to GenAI, natural language interfaces, or conversational applications, Consulting or solution design experience in customer-facing roles"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_276f3a05-2e9"},"title":"Field CTO - America Industries","description":"<p>We are seeking a Field Chief Technology Officer (Field CTO) for the Americas Industries Business Unit to be a senior, customer-facing technology and business transformation thought leader for our most strategic, often global, accounts in regulated industries.</p>\n<p>This individual contributor role sits at the intersection of data and AI strategy, industry transformation, and executive relationship-building, working closely with C-level leaders to drive multi-year change on the data platform while representing real-world needs back into Databricks.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Building and maintaining trusted-advisor relationships with C-level executives in large US-based and global accounts, especially in highly regulated industries.</li>\n<li>Cultivating a strong social and professional network across customer executives, boards, key industry bodies, and partners.</li>\n<li>Shaping executive thinking on modern data and AI architectures, with emphasis on Lakehouse and data platform modernization as the primary lever for long-term Gen AI impact.</li>\n<li>Leading C-level briefings, strategy sessions, and multi-day workshops that connect business outcomes, regulatory constraints, and operating model change to concrete Databricks-based roadmaps.</li>\n<li>Serving as a deep technical counterpart in the field, maintaining L200–L300 proficiency across Databricks products and being able to credibly engage architects, data 
engineers, and data scientists on solution design and trade-offs.</li>\n<li>Generalizing patterns from the field into reusable reference architectures, industry blueprints, and best practices for regulated industries, and sharing them through blogs, webinars, whitepapers, and conference keynotes.</li>\n<li>Orchestrating the broader ecosystem (cloud providers, GSIs, consultancies, ISVs) around customer objectives, ensuring Databricks is at the center of multi-year transformation programs rather than isolated projects.</li>\n<li>Partnering with Account Executives, Solutions Architects, Industry Leads, and Product Specialists to drive complex, multi-year sales cycles, securing platform decisions and expansions while influencing ACV and consumption growth.</li>\n<li>Providing structured, prioritized feedback from strategic customers into Product, Engineering, and Field leadership to influence product roadmap, especially around data, governance, security, and regulated-industry requirements.</li>\n<li>Mentoring senior Field Engineering and industry-focused talent, contributing to a pipeline of principal- and CTO-level leaders and codifying ways of working for complex, regulated accounts.</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>15+ years of experience spanning enterprise technology and consulting, including leading or advising on multi-year data platform and analytics transformations in large, complex organizations.</li>\n<li>Significant time spent inside a large enterprise software or cloud company in roles that required navigating matrixed organizations and driving change at scale, combined with direct industry exposure rather than a career spent solely in horizontal software.</li>\n<li>Experience in or with regulated industries, with familiarity with regulatory and compliance considerations affecting data and AI platforms.</li>\n<li>A background that blends hands-on technology and architecture work on data platforms and analytics, organizational and 
operating model change, executive consulting or advisory, and proven ability to operate as a highly credible peer to C-level executives.</li>\n<li>Strong, proactive networker who is naturally curious about which associations, councils, and forums matter for a given customer set, and who uses those networks to create new executive entry points and opportunities.</li>\n<li>Demonstrated longevity and impact in prior roles, with evidence of building and sustaining long-term customer relationships and programs rather than frequent short stints.</li>\n</ul>","url":"https://yubhub.co/jobs/job_276f3a05-2e9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8306218002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$249,800-$343,400 USD","x-skills-required":["data and AI strategy","industry transformation","executive relationship-building","Lakehouse and data platform modernization","Gen AI impact","L200–L300 proficiency across Databricks products","solution design and trade-offs","reference architectures","industry blueprints","best practices for regulated industries","cloud providers","GSIs","consultancies","ISVs","complex, multi-year sales cycles","platform decisions and expansions","ACV and consumption growth","product roadmap","data governance","security","regulated-industry requirements"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:51:59.028Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data and AI strategy, industry transformation, 
executive relationship-building, Lakehouse and data platform modernization, Gen AI impact, L200–L300 proficiency across Databricks products, solution design and trade-offs, reference architectures, industry blueprints, best practices for regulated industries, cloud providers, GSIs, consultancies, ISVs, complex, multi-year sales cycles, platform decisions and expansions, ACV and consumption growth, product roadmap, data governance, security, regulated-industry requirements","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":249800,"maxValue":343400,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_f560b1d5-028"},"title":"Senior Digital Programs Manager","description":"<p>We are seeking an innovative and operationally-minded Digital Programs Manager to design, build, and maintain the programs, operations, and automation infrastructure that optimize the customer experience at scale and drive operational efficiency across all segments.</p>\n<p>This critical role is focused on maximizing retention by delivering a seamless, valuable, and consistent service through a hybrid digital and human approach, directly improving product adoption and customer engagement.</p>\n<p>You will establish a digital-first baseline of automated touchpoints for all scaled (downmarket) customers, complete with clear, data-driven escalation paths to human support for complex issues.</p>\n<p>Simultaneously, you will deliver workflows and automation that enable our Customer Success Architects to work faster and smarter.</p>\n<p>The ideal candidate thrives at the intersection of process, technology, and customer experience, and will be responsible for creating the playbooks and automations required to service a large volume of customers effectively and efficiently.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Define and execute the comprehensive digital 
Customer Experience (CX) strategy to align with overall business objectives, maximize customer value, and proactively address the needs of the scaled segment.</li>\n</ul>\n<ul>\n<li>Architect and deploy efficient and effective digital workflows for core customer journeys, including standardized customer onboarding and continuous lifecycle engagement programs.</li>\n</ul>\n<ul>\n<li>Manage end-to-end digital programs (e.g., onboarding, adoption campaigns, renewal notifications) tailored for the scaled customer segment.</li>\n</ul>\n<ul>\n<li>Design and execute campaigns using digital channels (email, Slack, webinars, etc.) to drive feature adoption and sustained product engagement.</li>\n</ul>\n<ul>\n<li>Continuously test, measure, and iterate on program performance to improve conversion rates, customer satisfaction scores (CSAT), and other key performance indicators (KPIs).</li>\n</ul>\n<ul>\n<li>Design the escalation logic and scoring models that trigger human intervention from automated sequences.</li>\n</ul>\n<ul>\n<li>Support the growth of our business by automating workflow elements for our higher-touch Enterprise team, such as programmatically identifying and flagging customer risk and surfacing high-value upsell opportunities.</li>\n</ul>\n<ul>\n<li>Create and document clear, repeatable operations playbooks and Standard Operating Procedures (SOPs) for key digital customer journeys.</li>\n</ul>\n<ul>\n<li>Serve as the primary liaison, working with Customer Success, Product, Sales, and Marketing teams to ensure alignment, gather requirements, and guarantee the effective execution of all digital CX initiatives.</li>\n</ul>\n<ul>\n<li>Collaborate with our data engineering and ops teams to ensure data cleanliness and segmentation accuracy within our customer systems to enable highly targeted and personalized digital outreach.</li>\n</ul>\n<ul>\n<li>Own, track, and analyze key program metrics and operational KPIs (e.g., digital engagement rates, adoption rates, 
churn reduction, customer health scores).</li>\n</ul>\n<ul>\n<li>Provide regular, insightful reporting to leadership and relevant stakeholders on the overall effectiveness, performance, and impact of digital programs within the scaled customer segment.</li>\n</ul>\n<ul>\n<li>Stay current with industry trends, emerging technologies, and best practices in digital CX. Iterate on programs based on direct customer feedback and data-driven insights.</li>\n</ul>\n<p>We&#39;re Looking For Someone Who Has:</p>\n<ul>\n<li>5+ years of experience in Program Management, Customer Success Operations, Digital Success, or a related role, preferably supporting a high-volume, scaled customer segment with hybrid digital/human experience and/or pooled coverage (B2B SaaS experience is a plus).</li>\n</ul>\n<ul>\n<li>Demonstrated experience in building, launching, and scaling digital programs designed to influence customer behavior (adoption, engagement, retention). Proven impact on activation and value adoption (beyond open rates/clicks).</li>\n</ul>\n<ul>\n<li>Strong operational skills, with expertise in process mapping, creating playbooks, and defining automation requirements.</li>\n</ul>\n<ul>\n<li>Proficiency with CRM systems, Marketing Automation platforms, and CS software.</li>\n</ul>\n<ul>\n<li>Excellent analytical skills and a data-driven approach, comfortable using data to tell a story and make recommendations.</li>\n</ul>\n<ul>\n<li>Excellent written and verbal communication skills.</li>\n</ul>\n<ul>\n<li>Strong process and project delivery discipline.</li>\n</ul>\n<ul>\n<li>Eager to learn new technologies and adapt to evolving customer needs.</li>\n</ul>\n<p>We&#39;d Be Extra Excited For Someone Who Has:</p>\n<ul>\n<li>Familiarity with Mixpanel, or a similar analytics tool, including familiarity with analytics implementation methods like SDKs, Customer Data Platforms (CDPs), and Event Streaming.</li>\n</ul>\n<ul>\n<li>Ability to build, script, or configure custom solutions to drive 
process automation or custom workflow creation.</li>\n</ul>\n<ul>\n<li>Experience writing SQL queries to pull, validate, and analyze customer data directly from a database.</li>\n</ul>\n<ul>\n<li>Experience architecting systems and data flows</li>\n</ul>\n<ul>\n<li>Familiarity with analytics best practices across business segments and verticals</li>\n</ul>","url":"https://yubhub.co/jobs/job_f560b1d5-028","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mixpanel","sameAs":"https://mixpanel.com","logo":"https://logos.yubhub.co/mixpanel.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mixpanel/jobs/7568212","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$163,500-$199,500 USD","x-skills-required":["Digital Programs Management","Customer Success Operations","Digital Success","Program Management","Customer Experience","Process Mapping","Automation Requirements","CRM Systems","Marketing Automation Platforms","CS Software","Data-Driven Approach","Analytics","SQL Queries","Database Analysis","System Architecture","Data Flows"],"x-skills-preferred":["Mixpanel","Analytics Tool","SDKs","Customer Data Platforms","Event Streaming","Process Automation","Custom Workflow Creation"],"datePosted":"2026-04-18T15:51:58.795Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City, US (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Operations","industry":"Technology","skills":"Digital Programs Management, Customer Success Operations, Digital Success, Program Management, Customer Experience, Process Mapping, Automation Requirements, CRM Systems, Marketing Automation Platforms, CS Software, Data-Driven Approach, Analytics, SQL Queries, Database Analysis, System Architecture, Data Flows, Mixpanel, Analytics Tool, 
SDKs, Customer Data Platforms, Event Streaming, Process Automation, Custom Workflow Creation","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":163500,"maxValue":199500,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_015afe59-9fd"},"title":"Data Analyst II","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>\n<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>\n<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Data at Brex</p>\n<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>\n<p>Our Data Scientists, Analysts, and Engineers work together to make data, and insights derived from data, a core asset across the company.</p>\n<p>What you’ll do</p>\n<p>As a Data Analyst II (DA), you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>\n<p>You will work closely with Data Scientists, Data Engineers, and partner teams to drive 
meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses.</p>\n<p>This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our New York office.</p>\n<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home.</p>\n<p>We currently require a minimum of three coordinated days in the office per week: Monday, Wednesday, and Thursday.</p>\n<p>As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities</p>\n<p>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</p>\n<p>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</p>\n<p>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</p>\n<p>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</p>\n<p>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</p>\n<p>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</p>\n<p>Contribute to the automation of recurring analyses and reporting workflows using Python.</p>\n<p>Requirements</p>\n<p>3+ years of experience in data analytics or a related role in a professional setting.</p>\n<p>2+ years of experience 
working directly with Sales, Operations, Product, or equivalent business teams.</p>\n<p>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</p>\n<p>Experience with Python for data analysis, automation, or scripting.</p>\n<p>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</p>\n<p>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</p>\n<p>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</p>\n<p>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automated reporting, and build self-service data tools.</p>\n<p>Bonus points</p>\n<p>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</p>\n<p>Familiarity with dbt for data modeling and transformation.</p>\n<p>Exposure to data pipeline orchestration tools (e.g., Airflow).</p>\n<p>Experience in fintech, financial services, or payments.</p>\n<p>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</p>\n<p>Compensation</p>\n<p>The expected salary range for this role is $93,600 - $117,000.</p>\n<p>However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity.</p>\n<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_015afe59-9fd","directApply":true,
"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8463702002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$93,600 - $117,000","x-skills-required":["SQL","Python","Business Intelligence","Data Visualization","Generative AI","LLM-based tools"],"x-skills-preferred":["Cloud data platforms","dbt","Data pipeline orchestration tools","Fintech","Financial services","Payments"],"datePosted":"2026-04-18T15:50:50.572Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New York, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, Financial services, Payments","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":93600,"maxValue":117000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_50808499-c0b"},"title":"Senior Customer Solutions Resident Architect","description":"<p>About Us</p>\n<p>dbt Labs is the pioneer of analytics engineering, helping data teams transform raw data into reliable, actionable insights. 
Since 2016, we’ve grown from an open source project into the leading analytics engineering platform, now used by over 90,000 teams every week, driving data transformations and AI use cases.</p>\n<p>As of February 2025, we’ve surpassed $100 million in annual recurring revenue (ARR) and serve more than 5,400 dbt Platform customers, including AstraZeneca, Sky, Nasdaq, Volvo, JetBlue, and SafetyCulture.</p>\n<p>We’re backed by top-tier investors including Andreessen Horowitz, Sequoia Capital, and Altimeter. At our core, we believe in empowering data practitioners:</p>\n<ul>\n<li>Reliable, high-quality data is the fuel that propels AI-powered data engineering.</li>\n</ul>\n<ul>\n<li>AI is changing data work, fast. dbt’s data control plane keeps data engineers ahead of that curve.</li>\n</ul>\n<ul>\n<li>We empower engineers to deliver reliable, governed data faster, cheaper, and at scale.</li>\n</ul>\n<p>dbt Labs is now synonymous with analytics engineering, defining the modern data stack and serving as the data control plane for enterprise teams around the world. And we’re just getting started.</p>\n<p>We’re growing fast and building a team of passionate, curious people across the globe. Learn more about what makes us special by checking out our values.</p>\n<p><strong>About the Role</strong></p>\n<p>We are seeking an experienced Senior Customer Solutions Resident Architect to join our team. In this role, you will drive critical customer outcomes by delivering high-impact technical guidance to strategic accounts. 
You will be part of a high-visibility initiative that supports pre-sales, accelerates adoption, enables key migrations, and mitigates churn risks.</p>\n<p>This role is designed to deploy RA-level expertise flexibly, aligning with customer and business needs to drive growth, retention, and expansion.</p>\n<p><strong>What You’ll Do</strong></p>\n<ul>\n<li>Accelerate Customer Success Across the Lifecycle</li>\n</ul>\n<ul>\n<li>Support strategic pre-sales opportunities by providing technical expertise to prospects</li>\n</ul>\n<ul>\n<li>Assist in launching and onboarding new customers who have not purchased RA services, ensuring they successfully adopt dbt Cloud</li>\n</ul>\n<ul>\n<li>Execute proactive adoption plays, including migrations, new feature implementations (e.g., Semantic Layer, Mesh), and major version upgrades</li>\n</ul>\n<ul>\n<li>Lead reactive adoption initiatives to de-risk churn or contraction and position accounts for future growth</li>\n</ul>\n<ul>\n<li>Deliver Technical Excellence</li>\n</ul>\n<ul>\n<li>Advise on architecture, design, implementation, troubleshooting, and best practices in dbt Cloud environments</li>\n</ul>\n<ul>\n<li>Build solution MVPs and guide long-term technical strategies tailored to customer needs</li>\n</ul>\n<ul>\n<li>Engage on multiple projects simultaneously with clear scoping, start and end dates, and outcome tracking</li>\n</ul>\n<ul>\n<li>Collaborate Across Teams</li>\n</ul>\n<ul>\n<li>Partner closely with Customer Solutions Architects (CSAs), Sales, Solutions Architects, Training, and Support</li>\n</ul>\n<ul>\n<li>Provide feedback to Product and Engineering to improve customer experience and prioritize technical needs</li>\n</ul>\n<ul>\n<li>Champion customer success through thoughtful, transparent communication and cross-functional collaboration</li>\n</ul>\n<ul>\n<li>Advance Best Practices and Team Impact</li>\n</ul>\n<ul>\n<li>Help build out and refine this evolving function alongside the broader RA 
organization</li>\n</ul>\n<ul>\n<li>Track and manage capacity and engagement effectiveness similarly to other RA-led initiatives</li>\n</ul>\n<p><strong>What You’ll Need</strong></p>\n<ul>\n<li>5+ years of experience in technical customer-facing roles such as post-sales consulting, technical architecture, or solution delivery</li>\n</ul>\n<ul>\n<li>Expertise with at least one modern cloud data platform (Snowflake, Databricks, BigQuery, or Redshift)</li>\n</ul>\n<ul>\n<li>Hands-on experience deploying or configuring dbt Cloud, with at least 1 year working with dbt</li>\n</ul>\n<ul>\n<li>Strong proficiency in SQL; working knowledge of Python in analytics contexts preferred</li>\n</ul>\n<ul>\n<li>Comfort leading technical project delivery, managing scope, timelines, and stakeholder expectations across multiple simultaneous engagements</li>\n</ul>\n<ul>\n<li>Clear, concise communication skills for both technical and executive audiences</li>\n</ul>\n<ul>\n<li>A collaborative mindset, thriving in a remote, transparent, and highly cross-functional organization</li>\n</ul>\n<ul>\n<li>Willingness to travel 2–4 times per year for company-wide events</li>\n</ul>\n<p><strong>What Will Make You Stand Out</strong></p>\n<ul>\n<li>dbt Analytics Engineering Certification</li>\n</ul>\n<ul>\n<li>Ability to influence technical direction and build consensus across internal and customer teams</li>\n</ul>\n<ul>\n<li>Experience with traditional enterprise ETL tools (e.g., Informatica, Datastage, Talend) and how they relate to modern data workflows</li>\n</ul>\n<ul>\n<li>Familiarity with strategic sales or renewal processes, including proactive and reactive adoption efforts</li>\n</ul>\n<ul>\n<li>Proven success accelerating usage, adoption, and expansion in large, complex accounts</li>\n</ul>\n<p><strong>Remote Hiring Process</strong></p>\n<ul>\n<li>Interview with a Talent Acquisition Partner</li>\n</ul>\n<ul>\n<li>Interview with Hiring 
Manager</li>\n</ul>\n<ul>\n<li>Task</li>\n</ul>\n<ul>\n<li>Task Review</li>\n</ul>\n<ul>\n<li>Final Values Interview</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Unlimited vacation time with a culture that actively encourages time off</li>\n</ul>\n<ul>\n<li>401k plan with 3% guaranteed company contribution</li>\n</ul>\n<ul>\n<li>Comprehensive healthcare coverage</li>\n</ul>\n<ul>\n<li>Generous paid parental leave</li>\n</ul>\n<ul>\n<li>Health &amp; wellness stipend</li>\n</ul>\n<ul>\n<li>Flexible stipends for:</li>\n</ul>\n<ul>\n<li>Home office setup</li>\n</ul>\n<ul>\n<li>Learning and development</li>\n</ul>\n<ul>\n<li>Office space</li>\n</ul>\n<ul>\n<li>And more!</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>We offer competitive compensation packages commensurate with experience, including salary, equity, and where applicable, performance-based pay. Our Talent Acquisition Team can answer questions around dbt Labs’ total rewards during your interview process.</p>\n<p>In Boston, Chicago, Denver, Los Angeles, Philadelphia, New York Metro, San Francisco, DC Metro, Seattle, and Austin, an alternate range may apply, as specified below.</p>\n<ul>\n<li>The typical starting salary range for this role in the specific locations listed is: $163,000 - $200,000</li>\n</ul>\n<ul>\n<li>The typical starting salary range for this role is: $146,000 - $180,000</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_50808499-c0b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4682381005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$146,000 - $180,000","x-skills-required":["modern cloud data platform","dbt 
Cloud","SQL","Python","technical project delivery","clear, concise communication skills"],"x-skills-preferred":["dbt Analytics Engineering Certification","traditional enterprise ETL tools","strategic sales or renewal processes","proven success accelerating usage, adoption, and expansion"],"datePosted":"2026-04-18T15:50:33.608Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"modern cloud data platform, dbt Cloud, SQL, Python, technical project delivery, clear, concise communication skills, dbt Analytics Engineering Certification, traditional enterprise ETL tools, strategic sales or renewal processes, proven success accelerating usage, adoption, and expansion","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":146000,"maxValue":180000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_37fdd29a-858"},"title":"Principal Product Manager, AI Compliance","description":"<p>We&#39;re looking for a Principal Product Manager to lead our AI compliance efforts. As a key member of our product team, you will define and drive the product strategy that ensures our AI models are developed, deployed, and used in a way that aligns with our values, strategic objectives, and account for relevant obligations.</p>\n<p>Your primary responsibility will be to drive AI compliance and governance product strategy across Pinner- and advertiser-facing AI products. 
This includes ensuring that model training and inference align with global regulatory obligations, internal policies, and Pinterest&#39;s values.</p>\n<p>You will own user-facing and internal data controls for AI training, setting requirements and roadmaps that balance product performance, privacy, and regulatory needs while ensuring effective internal enforcement and awareness.</p>\n<p>You will also expand and support AI compliance tooling, including detection and labeling systems, governance workflows, and inventories, so teams across Pinterest can build compliant AI experiences efficiently and consistently.</p>\n<p>In addition, you will partner deeply with Legal, Privacy, Policy, Security, and Engineering to translate complex regulatory requirements into clear, prioritized product workstreams and scalable platform capabilities.</p>\n<p>As a Principal Product Manager, you will lead high-visibility, cross-functional programs that span multiple product areas, infrastructure, and engineering teams, creating alignment on tradeoffs, sequencing, and resourcing for AI governance initiatives across the company.</p>\n<p>You will serve as a principal-level thought partner on AI governance, influencing long-range AI product strategy, identifying new risk or compliance gaps early, and advocating for the infrastructure and processes needed to keep Pinterest ahead of regulatory change.</p>\n<p>To be successful in this role, you will need to have deep AI/ML or data platform product experience, including building or scaling products that rely on model training data, user data, or complex backend systems. 
Experience in AI safety, responsible AI, or AI compliance is a strong plus.</p>\n<p>You will also need to have proven success driving cross-company, multi-year programs at a Staff/Principal level (or equivalent), especially in ambiguous problem spaces that require both strategic frameworks and hands-on execution.</p>\n<p>Additionally, you will need to have experience working closely with Legal, Privacy, Policy, and Security teams to operationalize regulatory or policy requirements into product features, data controls, and platform capabilities.</p>\n<p>Fluency with data and governance concepts is also required, including comfort reasoning about data classification, lineage, retention, opt-outs, and enforcement, and partnering with technical teams to define robust measurement and monitoring.</p>\n<p>Finally, you will need to have exceptional communication and influence skills, with the ability to align executive stakeholders and cross-functional partners around a clear point of view on AI risk, compliance tradeoffs, and sequencing.</p>\n<p>If you are a motivated and experienced product leader who is passionate about AI governance and compliance, we encourage you to apply for this exciting opportunity.</p>","url":"https://yubhub.co/jobs/job_37fdd29a-858","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Pinterest","sameAs":"https://www.pinterest.com/","logo":"https://logos.yubhub.co/pinterest.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/pinterest/jobs/7494948","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$228,911-$471,286 USD","x-skills-required":["AI/ML","Data Platform","Product Management","Compliance","Governance","Regulatory Affairs","Policy Development","Risk 
Management","Communication","Influence"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:50:28.613Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA, US; Remote, US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"AI/ML, Data Platform, Product Management, Compliance, Governance, Regulatory Affairs, Policy Development, Risk Management, Communication, Influence","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":228911,"maxValue":471286,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e40d534f-76a"},"title":"Resident Architect","description":"<p>About Us</p>\n<p>dbt Labs is the pioneer of analytics engineering, helping data teams transform raw data into reliable, actionable insights. As of February 2025, we&#39;ve surpassed $100 million in annual recurring revenue (ARR) and serve more than 5,400 dbt Platform customers.</p>\n<p>We&#39;re seeking an experienced Resident Architect (RA) with a passion for solving challenging problems with dbt to join our Professional Services team. 
RAs are billable to dbt Enterprise customers and help achieve our mission to empower data developers to create and disseminate organisational knowledge.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects - inclusive of implementation, troubleshooting configurations, instilling best practices, and solutioning MVPs and long-term solutions to customer-specific requirements</li>\n</ul>\n<ul>\n<li>Consult on architecture and design</li>\n</ul>\n<ul>\n<li>Ensure our most strategic enterprise customers are adopting the product</li>\n</ul>\n<ul>\n<li>Collaborate with other internal customer-facing teams at dbt Labs - Sales, Solution Architects, Training, Support</li>\n</ul>\n<ul>\n<li>Provide critical feedback to dbt Labs product and engineering teams to improve and prioritise customer requests and ensure rapid resolution for engagement-specific issues</li>\n</ul>\n<ul>\n<li>Become a product expert with dbt in the context of the modern data stack (if you aren&#39;t already)</li>\n</ul>\n<p>What You&#39;ll Need</p>\n<ul>\n<li>4+ years&#39; experience working with technical data tooling, even better if it is in a customer-facing post-sales, technical architect or consulting role</li>\n</ul>\n<ul>\n<li>Deep expertise in at least one data platform (Snowflake, Databricks, BigQuery, Redshift)</li>\n</ul>\n<ul>\n<li>Experience using, deploying, or configuring dbt in an enterprise setting - working with dbt for a minimum of 1 year</li>\n</ul>\n<ul>\n<li>Proficiency in writing SQL and Python in analytics contexts</li>\n</ul>\n<ul>\n<li>You look forward to building skills in technical areas that support deployment and integration of dbt enterprise solutions to complete customer projects</li>\n</ul>\n<ul>\n<li>Customer focus, embracing one of our core values: users are our best advocates</li>\n</ul>\n<ul>\n<li>Strong organisational skills with the ability to manage multiple technical projects simultaneously - including defining scope, 
tracking timelines, and ensuring deliverables are met</li>\n</ul>\n<ul>\n<li>Clear and concise communicator with the ability to engage internal and external stakeholders, effectively explain complex technical or organisational challenges, and propose thoughtful, iterative solutions</li>\n</ul>\n<ul>\n<li>The ability to thrive in a remote organisation that highly values transparency and cross-collaboration</li>\n</ul>\n<ul>\n<li>Travel approximately 2-4x/year for customer onsite sessions, team offsites, and company events will be expected</li>\n</ul>\n<p>What Will Make You Stand Out</p>\n<ul>\n<li>You have obtained the dbt Analytics Engineering Certification</li>\n</ul>\n<ul>\n<li>You have the ability to advise on dbt enterprise recommendations, and build direction/consensus with the customer to move forward</li>\n</ul>\n<ul>\n<li>Experience with traditional Enterprise ETL tooling (Informatica, Datastage, Talend)</li>\n</ul>\n<p>Remote Hiring Process</p>\n<ul>\n<li>Interview with a Talent Acquisition Partner</li>\n</ul>\n<ul>\n<li>Hiring Manager Interview</li>\n</ul>\n<ul>\n<li>Technical Task + Presentation</li>\n</ul>\n<ul>\n<li>Team Interview</li>\n</ul>\n<p>Benefits</p>\n<ul>\n<li>Unlimited vacation time with a culture that actively encourages time off</li>\n</ul>\n<ul>\n<li>401k plan with 3% guaranteed company contribution</li>\n</ul>\n<ul>\n<li>Comprehensive healthcare coverage</li>\n</ul>\n<ul>\n<li>Generous paid parental leave</li>\n</ul>\n<ul>\n<li>Flexible stipends for:</li>\n</ul>\n<ul>\n<li>Health &amp; Wellness</li>\n</ul>\n<ul>\n<li>Home Office Setup</li>\n</ul>\n<ul>\n<li>Cell Phone &amp; Internet</li>\n</ul>\n<ul>\n<li>Learning &amp; Development</li>\n</ul>\n<ul>\n<li>Office Space</li>\n</ul>\n<p>Compensation</p>\n<p>We offer competitive compensation packages commensurate with experience, including salary, equity, and where applicable, performance-based pay. 
Our Talent Acquisition Team can answer questions around dbt Labs&#39; total rewards during your interview process.</p>\n<p>In select locations (including Boston, Chicago, Denver, Los Angeles, Philadelphia, New York City, San Francisco, Washington, DC, and Seattle), an alternate range may apply, as specified below.</p>\n<ul>\n<li>The typical starting salary range for this role is:</li>\n</ul>\n<p>$114,000 - $137,700</p>\n<ul>\n<li>The typical starting salary range for this role in the select locations listed is:</li>\n</ul>\n<p>$126,000 - $153,000</p>","url":"https://yubhub.co/jobs/job_e40d534f-76a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4627942005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$114,000 - $137,700","x-skills-required":["dbt","data platform","Snowflake","Databricks","BigQuery","Redshift","SQL","Python","analytics engineering"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:49:56.862Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"dbt, data platform, Snowflake, Databricks, BigQuery, Redshift, SQL, Python, analytics engineering","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":114000,"maxValue":137700,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_85f1f87e-70f"},"title":"Resident Solutions Architect - Financial Services","description":"<p>As a 
Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n</ul>\n<ul>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n</ul>\n<ul>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n</ul>\n<ul>\n<li>Consult on architecture and design; bootstrap hands-on projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n</ul>\n<ul>\n<li>Provide an escalated level of support for customer operational issues.</li>\n</ul>\n<ul>\n<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>\n</ul>\n<ul>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>9+ years&#39; 
experience in data engineering, data platforms &amp; analytics</li>\n</ul>\n<ul>\n<li>Comfortable writing code in either Python or Scala</li>\n</ul>\n<ul>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n</ul>\n<ul>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>\n</ul>\n<ul>\n<li>Familiarity with CI/CD for production deployments</li>\n</ul>\n<ul>\n<li>Working knowledge of MLOps</li>\n</ul>\n<ul>\n<li>Capable of design and deployment of highly performant end-to-end data architectures</li>\n</ul>\n<ul>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n</ul>\n<ul>\n<li>Documentation and white-boarding skills.</li>\n</ul>\n<ul>\n<li>Experience working with clients and managing conflicts.</li>\n</ul>\n<ul>\n<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>\n</ul>\n<ul>\n<li>Travel to customers up to 20% of the time</li>\n</ul>\n<p>Nice to have: Databricks Certification</p>","url":"https://yubhub.co/jobs/job_85f1f87e-70f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8461327002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client 
management"],"x-skills-preferred":["Databricks Certification"],"datePosted":"2026-04-18T15:49:55.028Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Austin, Texas"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ffd169d9-40b"},"title":"Resident Solutions Architect - Communications, Media, Entertainment & Games","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n</ul>\n<ul>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n</ul>\n<ul>\n<li>Guide 
strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n</ul>\n<ul>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n</ul>\n<ul>\n<li>Provide an escalated level of support for customer operational issues.</li>\n</ul>\n<ul>\n<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>\n</ul>\n<ul>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years&#39; experience in data engineering, data platforms &amp; analytics</li>\n</ul>\n<ul>\n<li>Comfortable writing code in either Python or Scala</li>\n</ul>\n<ul>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n</ul>\n<ul>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n</ul>\n<ul>\n<li>Familiarity with CI/CD for production deployments</li>\n</ul>\n<ul>\n<li>Working knowledge of MLOps</li>\n</ul>\n<ul>\n<li>Design and deployment of performant end-to-end data architectures</li>\n</ul>\n<ul>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n</ul>\n<ul>\n<li>Documentation and white-boarding skills.</li>\n</ul>\n<ul>\n<li>Experience working with clients and managing conflicts.</li>\n</ul>\n<ul>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer 
projects.</li>\n</ul>\n<ul>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in, visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>","url":"https://yubhub.co/jobs/job_ffd169d9-40b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8461239002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data science","cloud technology","Apache Spark","CI/CD","MLOps","data platforms & 
analytics","Python","Scala","AWS","Azure","GCP"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:49:46.649Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Atlanta, Georgia"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, data platforms & analytics, Python, Scala, AWS, Azure, GCP","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_760c3e88-e35"},"title":"Senior Product Manager, Data","description":"<p>Job Title: Senior Product Manager, Data</p>\n<p>We are seeking a Senior Product Manager to support the development of CoreWeave&#39;s Enterprise Data Platform within the CIO organization. This role will contribute to building a scalable, high-performance data lake and data architecture, integrating data from key sources across Operations, Engineering, Sales, Finance, and other IT partners.</p>\n<p>As a Senior Product Manager for Data Infrastructure and Analytics, you will help drive data ingestion, transformation, governance, and analytics enablement. 
You will collaborate with engineering, analytics, finance, and business teams to help deliver data lake and pipeline orchestration solutions, ensuring accessible data for business insights.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Own and evangelize Data Platform and Business Analytics roadmap and strategy across CoreWeave</li>\n<li>Assist with the execution of CoreWeave&#39;s enterprise data architecture, helping enable the data lake and domain-driven data layer</li>\n<li>Support the development and enhancement of data ingestion, transformation, and orchestration pipelines for scalability, efficiency, and reliability</li>\n<li>Work with the Engineering and Data teams to maintain and enhance data pipelines for both structured and unstructured data, enabling efficient data movement across the organization</li>\n<li>Collaborate with Finance, GTM, Infrastructure, Data Center, and Supply Chain teams to help unify and model data from core systems (ERP, CRM, Asset Mgmt, Supply Chain systems, etc.)</li>\n<li>Contribute to data governance and quality initiatives, focusing on data consistency, lineage tracking, and compliance with security standards</li>\n<li>Support the BI and analytics layer by partnering with stakeholders to enable data products, dashboards, and reporting capabilities</li>\n<li>Help prioritize data-driven initiatives, ensuring alignment with business goals and operational needs in coordination with leadership</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5+ years of experience in data product management, data architecture, or enterprise data engineering roles</li>\n<li>Familiarity with data lakes, data warehouses, ETL/ELT and streaming pipelines, and data governance frameworks</li>\n<li>Hands-on experience with modern data stack technologies (such as Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka)</li>\n<li>Understanding of data modeling, domain-driven design, and creating scalable data platforms</li>\n<li>Experience supporting the 
end-to-end data product lifecycle, including requirements gathering and implementation</li>\n<li>Strong collaboration skills with engineering, analytics, and business teams to help deliver data initiatives</li>\n<li>Awareness of data security, compliance, and governance best practices</li>\n<li>Understanding of BI and analytics platforms (such as Tableau, Looker, Power BI) and supporting self-service analytics</li>\n</ul>\n<p>Why CoreWeave?</p>\n<p>At CoreWeave, we work hard, have fun, and move fast! We&#39;re in an exciting stage of hyper-growth that you will not want to miss out on. We&#39;re not afraid of a little chaos, and we&#39;re constantly learning. Our team cares deeply about how we build our product and how we work together, which is represented through our core values:</p>\n<ul>\n<li>Be Curious at Your Core</li>\n<li>Act Like an Owner</li>\n<li>Empower Employees</li>\n<li>Deliver Best-in-Class Client Experiences</li>\n<li>Achieve More Together</li>\n</ul>\n<p>We support and encourage an entrepreneurial outlook and independent thinking. We foster an environment that encourages collaboration and provides the opportunity to develop innovative solutions to complex problems. As we get set for take off, the growth opportunities within the organization are constantly expanding. You will be surrounded by some of the best talent in the industry, who will want to learn from you, too. 
Come join us!</p>\n<p>Salary Range: $143,000 to $210,000</p>\n<p>Benefits:</p>\n<ul>\n<li>Medical, dental, and vision insurance - 100% paid for by CoreWeave</li>\n<li>Company-paid Life Insurance</li>\n<li>Voluntary supplemental life insurance</li>\n<li>Short and long-term disability insurance</li>\n<li>Flexible Spending Account</li>\n<li>Health Savings Account</li>\n<li>Tuition Reimbursement</li>\n<li>Ability to Participate in Employee Stock Purchase Program (ESPP)</li>\n<li>Mental Wellness Benefits through Spring Health</li>\n<li>Family-Forming support provided by Carrot</li>\n<li>Paid Parental Leave</li>\n<li>Flexible, full-service childcare support with Kinside</li>\n<li>401(k) with a generous employer match</li>\n<li>Flexible PTO</li>\n<li>Catered lunch each day in our office and data center locations</li>\n<li>A casual work environment</li>\n<li>A work culture focused on innovative disruption</li>\n</ul>\n<p>Workplace:</p>\n<p>While we prioritize a hybrid work environment, remote work may be considered for candidates located more than 30 miles from an office, based on role requirements for specialized skill sets. New hires will be invited to attend onboarding at one of our hubs within their first month. 
Teams also gather quarterly to support collaboration.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_760c3e88-e35","directApply":true,"hiringOrganization":{"@type":"Organization","name":"CoreWeave","sameAs":"https://www.coreweave.com","logo":"https://logos.yubhub.co/coreweave.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coreweave/jobs/4649824006","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$143,000 to $210,000","x-skills-required":["data product management","data architecture","enterprise data engineering","data lakes","data warehouses","ETL/ELT and streaming pipelines","data governance frameworks","modern data stack technologies","Snowflake","BigQuery","Databricks","Apache Spark","Airflow","DBT","Kafka","data modeling","domain-driven design","scalable data platforms","BI and analytics platforms","Tableau","Looker","Power BI"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:48:58.405Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Livingston, NJ / New York, NY / Sunnyvale, CA / Bellevue, WA/San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data product management, data architecture, enterprise data engineering, data lakes, data warehouses, ETL/ELT and streaming pipelines, data governance frameworks, modern data stack technologies, Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka, data modeling, domain-driven design, scalable data platforms, BI and analytics platforms, Tableau, Looker, Power 
BI","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":143000,"maxValue":210000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3168d7d3-70b"},"title":"Partner Solutions Architect - North America","description":"<p>About Us</p>\n<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across North America. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>As a Partner Solutions Architect, you will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud. Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Partner closely with North America Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field 
events</li>\n<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>\n<li>Strong written communication skills for building collateral, technical narratives, and 
partner-facing content</li>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>Benefits</p>\n<ul>\n<li>Unlimited vacation (and yes we use it!)</li>\n<li>Pension coverage</li>\n<li>Excellent healthcare</li>\n<li>Paid Parental Leave</li>\n<li>Wellness stipend</li>\n<li>Home office stipend, and more!</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3168d7d3-70b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673630005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data 
platforms","Snowflake","Databricks","Google Cloud","partner development","field engineering","sales engineering","consulting","partner engineering"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:48:30.813Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Canada - Remote; US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner development, field engineering, sales engineering, consulting, partner engineering, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_26f523c0-bbd"},"title":"Resident Solutions Architect - Manufacturing","description":"<p>As a Resident Solutions Architect (RSA) on our Professional Services team, you will work with customers on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Handle a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing 
customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues</li>\n<li>Collaborate with the Databricks Technical, Project Manager, Architect and Customer teams to ensure the technical components of the engagement are delivered to meet customer&#39;s needs</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines</li>\n<li>Documentation and white-boarding skills</li>\n<li>Experience working with clients and managing conflicts</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>\n<li>Ability to travel up to 30% when needed</li>\n</ul>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location is in, visit our page here.</p>","url":"https://yubhub.co/jobs/job_26f523c0-bbd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8494154002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems","Apache Spark","CI/CD","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client 
management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:48:21.946Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Boston, Massachusetts"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2c0fc802-cf3"},"title":"Staff Product Manager, AI Platform","description":"<p>At Databricks, we are building the world&#39;s best data and AI infrastructure platform. As a Staff Product Manager on the AI Platform team, you will drive the vision and roadmap for AI platform product areas and define how customers build, train, deploy, and monitor AI and ML systems on Databricks.</p>\n<p>You will own the product roadmap for AI platform areas, defining what we build, why, and in what order, to accelerate customer adoption of AI and ML in production. You will drive strategy for key AI platform capabilities, shaping how enterprises operationalize AI at scale.</p>\n<p>You will partner closely with engineering teams to make deeply technical decisions about ML infrastructure, from distributed training architectures to real-time serving systems. 
You will represent the voice of the customer by engaging directly with enterprise ML teams, translating their pain points and workflows into platform capabilities that simplify the path to production AI.</p>\n<p>You will collaborate with GTM, Solutions Architecture, and Customer Success teams to drive enterprise adoption, shape field enablement, and inform competitive positioning. You will define pricing, packaging, and commercialization strategy for AI platform features, working with business teams to maximize value capture.</p>\n<p>You will grow end-user engagement with Databricks AI tools by identifying adoption bottlenecks and partnering cross-functionally to remove them.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2c0fc802-cf3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8420609002","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$181,700-$249,800 USD","x-skills-required":["Deep technical background in computer science, electrical engineering, or equivalent degree","Experience with ML/AI infrastructure, data platforms, or cloud services","Proven enterprise B2B product management experience with highly technical customers","Ability to engage credibly with world-class ML engineers","Familiarity with recommendation systems"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:48:11.374Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Deep technical background in computer science, electrical 
engineering, or equivalent degree, Experience with ML/AI infrastructure, data platforms, or cloud services, Proven enterprise B2B product management experience with highly technical customers, Ability to engage credibly with world-class ML engineers, Familiarity with recommendation systems","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":181700,"maxValue":249800,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0ff568ca-d59"},"title":"Senior Software Engineer - Data Infrastructure Services","description":"<p>CoreWeave is seeking a senior software engineer to join its Data Platforms Team. The ideal candidate will have experience in database and stream processing, and will be responsible for designing and implementing the platform to deliver data to teams with a focus on providing managed solutions through APIs.</p>\n<p>The successful candidate will participate in operations and scaling of relational data platforms, develop a stream processing architecture, and improve the performance, security, reliability, and scalability of our data platforms and related services. They will also establish guidelines, guardrails for data access and storage for stakeholder teams, and ensure compliance with standards for data protection regulation.</p>\n<p>In addition to technical skills, the ideal candidate will be able to grow, change, invest in their teammates, be invested-in, share their ideas, listen to others, be curious, have fun, and be themselves. 
CoreWeave values diversity and inclusion, and encourages candidates from all backgrounds to apply.</p>\n<p>Key responsibilities:</p>\n<ul>\n<li>Design and implement the platform to deliver data to teams with a focus on providing managed solutions through APIs</li>\n<li>Participate in operations and scaling of relational data platforms</li>\n<li>Develop a stream processing architecture</li>\n<li>Improve the performance, security, reliability, and scalability of our data platforms and related services</li>\n<li>Establish guidelines and guardrails for data access and storage for stakeholder teams</li>\n<li>Ensure compliance with standards for data protection regulation</li>\n<li>Grow, change, invest in your teammates, be invested-in, share your ideas, listen to others, be curious, have fun, and be yourself</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5+ years of experience in software or infrastructure engineering</li>\n<li>Experience operating services in production and at scale</li>\n<li>Familiarity with a distributed NewSQL datastore such as CockroachDB, TiDB, YDB, or Yugabyte, and/or stream processing tools such as NATS or Kafka</li>\n<li>Experience with designing and operating these systems at scale</li>\n<li>Familiarity with Kubernetes and an interest in, or comfort with, using it for event-driven and/or stateful orchestration</li>\n<li>Proficiency in Go/Python/Java and an interest in contributing to open source</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0ff568ca-d59","directApply":true,"hiringOrganization":{"@type":"Organization","name":"CoreWeave","sameAs":"https://www.coreweave.com","logo":"https://logos.yubhub.co/coreweave.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coreweave/jobs/4671479006","x-work-arrangement":"hybrid","x-experience-level":null,"x-job-type":"full-time","x-salary-range":"$165,000 to $242,000","x-skills-required":["database and stream processing","API design and implementation","operational and scaling of relational data platforms","stream processing architecture","performance, security, reliability, and scalability of data platforms","data access and storage guidelines","data protection regulation compliance"],"x-skills-preferred":["Kubernetes","Go/Python/Java","open source contribution"],"datePosted":"2026-04-18T15:48:10.919Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Sunnyvale, CA / Bellevue, WA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"database and stream processing, API design and implementation, operational and scaling of relational data platforms, stream processing architecture, performance, security, reliability, and scalability of data platforms, data access and storage guidelines, data protection regulation compliance, Kubernetes, Go/Python/Java, open source contribution","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":165000,"maxValue":242000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3d57b93e-423"},"title":"Resident Solutions Architect - Financial Services","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data 
challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap hands-on projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with 
Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Capable of design and deployment of highly performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>\n<li>Travel to customers up to 20% of the time</li>\n</ul>\n<p>Nice to have:</p>\n<ul>\n<li>Databricks Certification</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3d57b93e-423","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8456948002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","data architecture","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":["Databricks Certification"],"datePosted":"2026-04-18T15:47:22.867Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Atlanta, Georgia"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, data 
architecture, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c1a6c403-cf8"},"title":"Strategic Account Executive, UAE","description":"<p>Want to help solve the world&#39;s toughest problems with data and AI?</p>\n<p>This is what we do every day at Databricks.</p>\n<p>We are looking for an Account Executive to manage our most strategic customer in the UAE. The role will be based in London and will require you to travel to the UAE on a regular basis.</p>\n<p>As a Strategic Account Executive, you are a sales professional experienced in selling to large Enterprise accounts. You know how to sell innovation and change through customer vision expansion and can guide deals forward to compress decision cycles.</p>\n<p>Key responsibilities:</p>\n<ul>\n<li>Assess your territory and develop a successful execution strategy</li>\n<li>Exceed activity and quarterly revenue targets</li>\n<li>Track all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>\n<li>Identify new use case opportunities and showcase value to existing customers</li>\n<li>Promote the value of the Databricks&#39; Data Intelligence Platform</li>\n<li>Orchestrate and utilise our field engineering teams to ensure valuable outcomes for clients</li>\n<li>Build and demonstrate value with all engagements to guide successful negotiations to close point</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>Extensive understanding of the data platform, open source and cloud ecosystems</li>\n<li>Highly skilled in prospecting research and ability to map out key stakeholders</li>\n<li>Demonstrated success in Value Selling and developing a mutual 
action plan</li>\n<li>Ability to influence decision-making and strategy with customer leadership teams</li>\n<li>Ability to establish credibility with the C-suite</li>\n<li>Adept in selling to technical buyers</li>\n<li>Mastery of MEDDPICC</li>\n<li>Bachelor&#39;s Degree or relevant work experience</li>\n<li>Fluency in English is required, fluency in Arabic is preferred</li>\n</ul>\n<p>Benefits:</p>\n<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region click here.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_c1a6c403-cf8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8452487002","x-work-arrangement":"hybrid","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data platform","open source","cloud ecosystems","prospecting research","value selling","MEDDPICC"],"x-skills-preferred":["Arabic"],"datePosted":"2026-04-18T15:47:03.186Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, United Kingdom"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"data platform, open source, cloud ecosystems, prospecting research, value selling, MEDDPICC, Arabic"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c7ba4251-36b"},"title":"Resident Solutions Architect - Public Sector","description":"<p>Job Title: Resident Solutions Architect - Public Sector</p>\n<p>We are seeking a highly skilled Resident Solutions Architect to join our Professional 
Services team in Washington, D.C. As a Resident Solutions Architect, you will work with customers on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Handle a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>\n<li>Provide an escalated level of support for customer operational issues</li>\n<li>Collaborate with the Databricks Technical, Project Manager, Architect and Customer teams to ensure the technical components of the engagement are delivered to meet customer&#39;s needs</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>US Top Secret clearance required for this position</li>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and 
deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines</li>\n<li>Documentation and white-boarding skills</li>\n<li>Experience working with clients and managing conflicts</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>\n<li>Bachelor&#39;s degree in Computer Science, Information Systems, Engineering, or equivalent experience through work experience</li>\n<li>Ability to travel up to 30% when needed</li>\n</ul>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD Zone 2 Pay Range $180,656-$248,360 USD Zone 3 Pay Range $180,656-$248,360 USD Zone 4 Pay Range $180,656-$248,360 USD</p>\n<p>About Databricks</p>\n<p>Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. 
Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.</p>\n<p>Benefits</p>\n<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region click here.</p>\n<p>Our Commitment to Diversity and Inclusion</p>\n<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.</p>\n<p>Compliance</p>\n<p>If access to export-controlled technology or source code is required for performance of job duties, it is within Employer&#39;s discretion whether to apply for a U.S. 
government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_c7ba4251-36b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8356289002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems","Apache Spark","CI/CD","MLOps","end-to-end data architectures","technical project delivery","scope and timelines","documentation and white-boarding","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:53.995Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, D.C."}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, scope and timelines, documentation and white-boarding, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d858a729-7fc"},"title":"Senior Manager, Finance and Accounting Systems","description":"<p>You will lead the Finance Systems team responsible for architecting, scaling, and supporting enterprise platforms across GL, Revenue, Billing, Fixed Assets, Inventory, Lease 
Accounting. You&#39;ll partner closely with IT Product management, Finance and accounting, RevOps, Integration and Data Platforms to drive system implementation, support, compliance, automation, and integration.</p>\n<p>In this role, you will own end-to-end Finance &amp; accounting system architecture spanning key business processes around GL, AP, AR, ARM, Fixed Assets, Leases and inventory accounting. You will act as the primary systems partner for Finance stakeholders, including Revenue Accounting, Operations Accounting, Tax, Fixed Assets, Treasury, GL and AP teams, while working with IT Product Management on executing the roadmap across modules aligned with business priorities.</p>\n<p>You will lead a team of BSEs and administrators to deliver scalable solutions aligned to business goals, implement and execute audit ready ITGC &amp; ITAC controls for SOX Compliance, and partner with IT integration teams on cross-functional workflows and AI agents for process automation and optimization. You will also provide leadership for Finance systems admin support, GL, and operational excellence.</p>\n<p>The base salary range for this role is $207,000 to $250,000. The starting salary will be determined based on job-related knowledge, skills, experience, and market location. We strive for both market alignment and internal equity when determining compensation. In addition to base salary, our total rewards package includes a discretionary bonus, equity awards, and a comprehensive benefits program (all based on eligibility).</p>\n<p>The range we’ve posted represents the typical compensation range for this role. To determine actual compensation, we review the market rate for each candidate which can include a variety of factors. 
These include qualifications, experience, interview performance, and location.</p>\n<p>In addition to a competitive salary, we offer a variety of benefits to support your needs, including:</p>\n<ul>\n<li>Medical, dental, and vision insurance - 100% paid for by CoreWeave</li>\n</ul>\n<ul>\n<li>Company-paid Life Insurance</li>\n</ul>\n<ul>\n<li>Voluntary supplemental life insurance</li>\n</ul>\n<ul>\n<li>Short and long-term disability insurance</li>\n</ul>\n<ul>\n<li>Flexible Spending Account</li>\n</ul>\n<ul>\n<li>Health Savings Account</li>\n</ul>\n<ul>\n<li>Tuition Reimbursement</li>\n</ul>\n<ul>\n<li>Ability to Participate in Employee Stock Purchase Program (ESPP)</li>\n</ul>\n<ul>\n<li>Mental Wellness Benefits through Spring Health</li>\n</ul>\n<ul>\n<li>Family-Forming support provided by Carrot</li>\n</ul>\n<ul>\n<li>Paid Parental Leave</li>\n</ul>\n<ul>\n<li>Flexible, full-service childcare support with Kinside</li>\n</ul>\n<ul>\n<li>401(k) with a generous employer match</li>\n</ul>\n<ul>\n<li>Flexible PTO</li>\n</ul>\n<ul>\n<li>Catered lunch each day in our office and data center locations</li>\n</ul>\n<ul>\n<li>A casual work environment</li>\n</ul>\n<ul>\n<li>A work culture focused on innovative disruption</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d858a729-7fc","directApply":true,"hiringOrganization":{"@type":"Organization","name":"CoreWeave","sameAs":"https://www.coreweave.com","logo":"https://logos.yubhub.co/coreweave.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coreweave/jobs/4655657006","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$207,000 to $250,000","x-skills-required":["ERP","finance systems leadership","technical teams management","finance processes","RevRec","GL","Fixed Assets","Tax","Inventory accounting management","Lease 
Accounting","treasury and Payments","Netsuite OneWorld","Advanced Financials","ARM","Multibook","Bank Payments","Squareworks AP Automation","Netgain - Netloan/NetAssets","Salesforce","Coupa","Kyriba","data platforms","SOX controls","auditors"],"x-skills-preferred":["AI use for addressing use cases in Finance & accounting","Ramp","Navan","Costar","FloQast","Auditboard","Workiva"],"datePosted":"2026-04-18T15:46:43.861Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Sunnyvale, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"ERP, finance systems leadership, technical teams management, finance processes, RevRec, GL, Fixed Assets, Tax, Inventory accounting management, Lease Accounting, treasury and Payments, Netsuite OneWorld, Advanced Financials, ARM, Multibook, Bank Payments, Squareworks AP Automation, Netgain - Netloan/NetAssets, Salesforce, Coupa, Kyriba, data platforms, SOX controls, auditors, AI use for addressing use cases in Finance & accounting, Ramp, Navan, Costar, FloQast, Auditboard, Workiva","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":207000,"maxValue":250000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e7613e05-073"},"title":"Customer Enablement Specialist","description":"<p>Job Title: Customer Enablement Specialist</p>\n<p>Location: Bellevue, Washington</p>\n<p>Department: Education &amp; Training</p>\n<p>CSQ227R234</p>\n<p><strong>About the Role</strong></p>\n<p>This role is required to work in a hybrid office setting in our Bellevue, WA office.</p>\n<p><strong>The Opportunity</strong></p>\n<p>Databricks runs some of the largest customer enablement programs in the industry , workshops, digital courses, labs, and webinars that reach thousands of users. 
The Customer Enablement Specialist turns that reach into results. You connect engaged learners to structured training plans that drive product adoption, customer success, and measurable business impact.</p>\n<p>This isn’t a sales or business development role: every conversation begins with an existing Databricks user or program participant. Your focus is on helping those customers move from initial interest to tangible capability: skilled teams, completed training milestones, and activated use cases.</p>\n<p>You’ll manage a broad portfolio of accounts, supporting new and emerging personas (business users, analysts, and app developers), and helping them succeed with Databricks’ latest innovations in AI/BI, Databricks Apps, and agent-based development.</p>\n<p><strong>What You&#39;ll Do</strong></p>\n<ul>\n<li>Convert participation in Databricks’ scale programs (webinars, workshops, digital learning) into structured training engagements.</li>\n</ul>\n<ul>\n<li>Own a high-volume enablement pipeline: identifying learner needs, recommending tailored paths, and tracking adoption progress.</li>\n</ul>\n<ul>\n<li>Deliver engaging L100–L200 sessions and demos to help new personas understand what’s possible with Databricks.</li>\n</ul>\n<ul>\n<li>Build enablement plans for each account, tracking trained users, completion rates, and milestone achievement.</li>\n</ul>\n<ul>\n<li>Partner with Customer Success Managers (CSMs), Account Executives (AEs), and senior CEAs to align training with customer goals and renewal cycles.</li>\n</ul>\n<ul>\n<li>Report key metrics (trained accounts, learner growth, conversion rates, and training revenue), using data to guide your priorities.</li>\n</ul>\n<ul>\n<li>Provide structured feedback to program and curriculum teams to sharpen future customer learning experiences.</li>\n</ul>\n<p><strong>What You Bring</strong></p>\n<ul>\n<li>2–4 years in a technical, customer-facing role; technical training, pre-sales, enablement, or customer 
success preferred.</li>\n</ul>\n<ul>\n<li>Hands-on familiarity with modern data and analytics platforms (Databricks, cloud SQL, BI tools, or data lakes).</li>\n</ul>\n<ul>\n<li>Confidence delivering introductory technical content to non-expert audiences.</li>\n</ul>\n<ul>\n<li>Working knowledge of AI/ML concepts; able to explain how Databricks enables practical use cases.</li>\n</ul>\n<ul>\n<li>Strong communication skills and a consultative approach: discover needs, recommend paths, and gain commitment.</li>\n</ul>\n<ul>\n<li>A data-driven mindset with strong organisational habits and comfort managing many concurrent accounts.</li>\n</ul>\n<ul>\n<li>Team-first attitude: a proactive collaborator who knows when to escalate for deeper technical support.</li>\n</ul>\n<p><strong>Bonus Points</strong></p>\n<ul>\n<li>Databricks certifications (e.g., Data Engineer Associate), or willingness to obtain one within 6 months.</li>\n</ul>\n<ul>\n<li>Background in SaaS, cloud, or data platforms; familiarity with BI or AI/BI tools (Databricks Genie, Tableau, Power BI).</li>\n</ul>\n<ul>\n<li>Exposure to Databricks Apps, REST APIs, or AI agent concepts.</li>\n</ul>\n<ul>\n<li>Experience in a role with enablement or training-related revenue metrics.</li>\n</ul>\n<p><strong>Why This Role, Why Now</strong></p>\n<p>New products create new skill gaps. As Databricks expands into AI/BI, Databricks Apps, and agent-based development, a new wave of users (business analysts, app builders, domain experts) needs to get skilled up quickly. The depth CEA team focuses on the complex, strategic, and deeply technical. This role focuses on the broad middle: high volume, new personas, and the scale-to-commitment motion that turns digital participation into real adoption. 
It is a high-visibility, high-impact position with a clear growth path into senior CEA work as you build depth and track record.</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location is in visit our page here.</p>\n<p>Zone 2 Pay Range $86,600-$119,150 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e7613e05-073","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8431935002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$86,600-$119,150 USD","x-skills-required":["data and analytics platforms","cloud SQL","BI tools","data lakes","AI/ML concepts","Databricks Apps","REST APIs","AI agent concepts"],"x-skills-preferred":["Databricks certifications","SaaS","cloud","data platforms","BI or AI/BI tools"],"datePosted":"2026-04-18T15:46:34.416Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bellevue, 
Washington"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data and analytics platforms, cloud SQL, BI tools, data lakes, AI/ML concepts, Databricks Apps, REST APIs, AI agent concepts, Databricks certifications, SaaS, cloud, data platforms, BI or AI/BI tools","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":86600,"maxValue":119150,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_cbd81d47-d7e"},"title":"Data Platform Solutions Architect (Professional Services)","description":"<p>We&#39;re hiring for multiple roles within our Professional Services team. This position may be offered as Senior Solutions Consultant, Resident Solutions Architect, or Senior Resident Solutions Architect. The final title will align to your experience, technical depth, and customer-facing ownership.</p>\n<p>As a Big Data Solutions Architect (Internal Title - Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service. 
You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>Extensive experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding 
skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 10% of the time</li>\n</ul>\n<p>[Preferred] Databricks Certification but not essential</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_cbd81d47-d7e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8486738002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":["Databricks Certification"],"datePosted":"2026-04-18T15:46:17.349Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, United Kingdom"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8efd6b3b-251"},"title":"Resident Solutions Architect - Public Sector","description":"<p>As a 
Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n</ul>\n<ul>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n</ul>\n<ul>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n</ul>\n<ul>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n</ul>\n<ul>\n<li>Provide an escalated level of support for customer operational issues.</li>\n</ul>\n<ul>\n<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>\n</ul>\n<ul>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years experience in data engineering, data 
platforms &amp; analytics</li>\n</ul>\n<ul>\n<li>Comfortable writing code in either Python or Scala</li>\n</ul>\n<ul>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n</ul>\n<ul>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n</ul>\n<ul>\n<li>Familiarity with CI/CD for production deployments</li>\n</ul>\n<ul>\n<li>Working knowledge of MLOps</li>\n</ul>\n<ul>\n<li>Design and deployment of performant end-to-end data architectures</li>\n</ul>\n<ul>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n</ul>\n<ul>\n<li>Documentation and white-boarding skills.</li>\n</ul>\n<ul>\n<li>Experience working with clients and managing conflicts.</li>\n</ul>\n<ul>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n</ul>\n<ul>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Databricks Certification</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. 
The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_8efd6b3b-251","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8456973002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","performant end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:45:55.475Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Boston, 
Massachusetts"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6d94d7ea-9ca"},"title":"Resident Solutions Architect - Financial Services","description":"<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n</ul>\n<ul>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n</ul>\n<ul>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n</ul>\n<ul>\n<li>Consult on architecture and 
design; bootstrap hands-on projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n</ul>\n<ul>\n<li>Provide an escalated level of support for customer operational issues.</li>\n</ul>\n<ul>\n<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>\n</ul>\n<ul>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>9+ years experience in data engineering, data platforms &amp; analytics</li>\n</ul>\n<ul>\n<li>Comfortable writing code in either Python or Scala</li>\n</ul>\n<ul>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n</ul>\n<ul>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>\n</ul>\n<ul>\n<li>Familiarity with CI/CD for production deployments</li>\n</ul>\n<ul>\n<li>Working knowledge of MLOps</li>\n</ul>\n<ul>\n<li>Capable of design and deployment of highly performant end-to-end data architectures</li>\n</ul>\n<ul>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n</ul>\n<ul>\n<li>Documentation and white-boarding skills.</li>\n</ul>\n<ul>\n<li>Experience working with clients and managing conflicts.</li>\n</ul>\n<ul>\n<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>\n</ul>\n<ul>\n<li>Travel to customers up to 20% of the time</li>\n</ul>\n<p>Nice to have: Databricks Certification</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6d94d7ea-9ca","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8461330002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","design and deployment of highly performant end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":["Databricks Certification"],"datePosted":"2026-04-18T15:45:27.183Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, D.C."}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, design and deployment of highly performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ab6e1e39-16f"},"title":"Core Account Executive, Nordics (Digital Natives)","description":"<p>Want to help solve the world&#39;s toughest problems with data and AI?</p>\n<p>This is what we do every day at Databricks.</p>\n<p>As a Core Account Executive (Digital Natives) at 
Databricks, you will own a focused territory of ~10 large Digital Native spending accounts across the Nordics.</p>\n<p>You will assess your territory and develop a successful execution strategy, drive quarter-on-quarter consumption growth in existing accounts, exceed activity and quarterly revenue targets, track all customer details, identify new use case opportunities, promote the value of the Databricks Data Intelligence Platform and other products, and ensure 100% satisfaction among all customers.</p>\n<p>We look for a good understanding of the data platform and cloud ecosystems, some exposure to the software industry and understanding of selling SaaS, Data and Business Value, experience growing consumption and closing commit deals in a direct sales role, competence with prospecting research and the ability to map out key stakeholders, advanced understanding of MEDDPICC, experience exceeding sales targets/quotas, and a Bachelor&#39;s Degree or relevant work experience.</p>\n<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit our website.</p>\n<p>Our Commitment to Diversity and Inclusion</p>\n<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. 
We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards.</p>","url":"https://yubhub.co/jobs/job_ab6e1e39-16f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8460871002","x-work-arrangement":"onsite","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data platform","cloud ecosystems","software industry","SaaS","Data and Business Value","prospecting research","MEDDPICC","sales targets/quotas"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:45:06.124Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Stockholm, Sweden"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"data platform, cloud ecosystems, software industry, SaaS, Data and Business Value, prospecting research, MEDDPICC, sales targets/quotas"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3d22e39a-bde"},"title":"Data Analyst II","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. 
By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>\n<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>\n<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Data at Brex</p>\n<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>\n<p>Our Data Scientists, Analysts, and Engineers work together to make data, and insights derived from data, a core asset across the company.</p>\n<p>What you’ll do</p>\n<p>As a Data Analyst II (DA), you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>\n<p>You will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses.</p>\n<p>This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our San Francisco office.</p>\n<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home.</p>\n<p>We currently require a minimum of three coordinated days in the office per week, Monday, 
Wednesday and Thursday.</p>\n<p>As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities</p>\n<p>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</p>\n<p>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</p>\n<p>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</p>\n<p>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</p>\n<p>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</p>\n<p>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</p>\n<p>Contribute to the automation of recurring analyses and reporting workflows using Python.</p>\n<p>Requirements</p>\n<p>3+ years of experience in data analytics or a related role in a professional setting.</p>\n<p>2+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</p>\n<p>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</p>\n<p>Experience with Python for data analysis, automation, or scripting.</p>\n<p>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</p>\n<p>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</p>\n<p>Strong communication skills and the ability to work 
effectively with stakeholders across different functions and levels of technical fluency.</p>\n<p>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automated reporting, and build self-service data tools.</p>\n<p>Bonus points</p>\n<p>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</p>\n<p>Familiarity with dbt for data modeling and transformation.</p>\n<p>Exposure to data pipeline orchestration tools (e.g., Airflow).</p>\n<p>Experience in fintech, financial services, or payments.</p>\n<p>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</p>\n<p>Compensation</p>\n<p>The expected salary range for this role is $93,600 - $117,000.</p>\n<p>However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity.</p>\n<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_3d22e39a-bde","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8463696002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$93,600 - $117,000","x-skills-required":["SQL","Python","Business Intelligence","Data Visualization","Generative AI","LLM-based tools"],"x-skills-preferred":["Cloud data platforms","dbt","Data pipeline orchestration tools","Fintech","Financial services","Payments"],"datePosted":"2026-04-18T15:44:50.317Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, 
California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, Financial services, Payments","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":93600,"maxValue":117000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8aca0e87-abb"},"title":"Strategic AI/BI Account Executive","description":"<p>We are seeking a Strategic AI/BI Account Executive to help enterprise customers transform how business users interact with data. This high-impact role sits within the AI Go-To-Market team and partners closely with Enterprise Account Executives to drive adoption of Databricks AI/BI and Genie.</p>\n<p>You will help organizations move beyond static dashboards to governed, conversational, AI-powered analytics at the center of the convergence of business intelligence, data platforms, and generative AI.</p>\n<p>Enterprise analytics is rapidly evolving from dashboards and static reporting to conversational, AI-driven decision platforms. 
Databricks AI/BI and Genie empower business users to securely interact with governed data using natural language, transforming the data platform into a true decision platform.</p>\n<p>If you want to be at the forefront of AI-powered analytics transformation at one of the fastest-growing data and AI companies in the world, this is your opportunity.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Partner with Enterprise AEs to identify, qualify, and close AI/BI opportunities</li>\n<li>Engage C-level, analytics, and line-of-business leaders to modernize analytics strategies</li>\n<li>Displace or expand legacy BI platforms with AI-powered, governed analytics solutions</li>\n<li>Lead conversations around semantic governance, self-service analytics, and natural language data access</li>\n<li>Drive proof-of-value engagements and scale enterprise-wide adoption</li>\n<li>Align AI/BI initiatives to measurable business outcomes (productivity, speed to insight, revenue impact)</li>\n<li>Enable field teams and serve as a subject matter expert on modern analytics architectures</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>Enterprise sales experience in BI, analytics, data platforms, or AI/ML</li>\n<li>Strong understanding of modern analytics architectures and data governance</li>\n<li>Ability to sell to both technical and business stakeholders</li>\n<li>Executive presence and experience navigating complex buying cycles</li>\n<li>Passion for AI and the impact of GenAI on enterprise analytics</li>\n<li>Experience operating in a specialist or overlay sales model</li>\n<li>Ability to translate technical capabilities into clear business value</li>\n<li>7+ years of Enterprise Sales experience, exceeding quotas in larger accounts</li>\n<li>Bachelor&#39;s Degree or equivalent experience</li>\n</ul>
","url":"https://yubhub.co/jobs/job_8aca0e87-abb","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8441888002","x-work-arrangement":"onsite","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Enterprise sales experience in BI, analytics, data platforms, or AI/ML","Strong understanding of modern analytics architectures and data governance","Ability to sell to both technical and business stakeholders","Executive presence and experience navigating complex buying cycles","Passion for AI and the impact of GenAI on enterprise analytics"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:44:42.641Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, United Kingdom"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Enterprise sales experience in BI, analytics, data platforms, or AI/ML, Strong understanding of modern analytics architectures and data governance, Ability to sell to both technical and business stakeholders, Executive presence and experience navigating complex buying cycles, Passion for AI and the impact of GenAI on enterprise analytics"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_43ca4523-d1d"},"title":"Customer Enablement Specialist","description":"<p>As a Customer Enablement Specialist at Databricks, you will play a critical role in helping customers succeed with our data and AI platform. 
You will be responsible for managing a broad portfolio of accounts, supporting new and emerging personas, and helping them succeed with Databricks&#39; latest innovations in AI/BI, Databricks Apps, and agent-based development.</p>\n<p>Your main objective will be to convert participation in Databricks&#39; scale programs (webinars, workshops, digital learning) into structured training engagements. You will own a high-volume enablement pipeline, identifying learner needs, recommending tailored paths, and tracking adoption progress.</p>\n<p>To achieve this, you will deliver engaging L100–L200 sessions and demos to help new personas understand what&#39;s possible with Databricks. You will build enablement plans for each account, tracking trained users, completion rates, and milestone achievement.</p>\n<p>You will partner with Customer Success Managers (CSMs), Account Executives (AEs), and senior CEAs to align training with customer goals and renewal cycles. You will report key metrics – trained accounts, learner growth, conversion rates, and training revenue – using data to guide your priorities.</p>\n<p>In addition, you will provide structured feedback to program and curriculum teams to sharpen future customer learning experiences.</p>\n<p>We are looking for a highly motivated and organized individual with excellent communication skills and a consultative approach. 
You should have hands-on familiarity with modern data and analytics platforms, confidence delivering introductory technical content to non-expert audiences, and a working knowledge of AI/ML concepts.</p>\n<p>If you are passionate about helping customers succeed and have a strong desire to learn and grow with our company, we encourage you to apply for this exciting opportunity.</p>","url":"https://yubhub.co/jobs/job_43ca4523-d1d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8431927002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$86,600-$119,150 USD","x-skills-required":["modern data and analytics platforms","customer-facing role","technical training","pre-sales","enablement","customer success","AI/BI","Databricks Apps","agent-based development"],"x-skills-preferred":["Databricks certifications","SaaS","cloud","data platforms","BI or AI/BI tools","Databricks Genie","Tableau","Power BI"],"datePosted":"2026-04-18T15:44:34.379Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bellevue, Washington"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"modern data and analytics platforms, customer-facing role, technical training, pre-sales, enablement, customer success, AI/BI, Databricks Apps, agent-based development, Databricks certifications, SaaS, cloud, data platforms, BI or AI/BI tools, Databricks Genie, Tableau, Power 
BI","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":86600,"maxValue":119150,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_fd64db3e-49f"},"title":"Staff Software Engineer – Customer Experience Intelligence (CXI)","description":"<p>At Databricks, we&#39;re shaping the future of how customers experience support at scale. As the Staff Technical Lead for Customer Experience Intelligence, you&#39;ll design intelligent, AI-powered systems that make support faster, smarter, and more effortless.</p>\n<p>In this role, you&#39;ll have end-to-end ownership of the architecture and technical strategy behind automation and agentic workflows that reduce mean time to mitigate (MTTM), boost quality, and enable our Support organization to scale impact without scaling headcount. You&#39;ll work hands-on with teams across Support, Product, and Platform Engineering to build seamless systems that anticipate customer needs before they arise.</p>\n<p>You&#39;ll lead the technical foundation that transforms how customers experience support, where issues are auto-diagnosed, solutions are delivered instantly, and engineers focus their time on the toughest challenges. 
Your success will mean customers moving faster, trusting Databricks deeper, and feeling the impact of your systems every day.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Owning the technical vision and architecture for Databricks&#39; Support Automation and Tooling ecosystem</li>\n<li>Leading hands-on development of automation to improve customer experience and Support scalability</li>\n<li>Driving rapid, iterative development while upholding quality, safety, and reliability standards</li>\n<li>Designing agentic workflows that evolve from human-in-the-loop to fully automated systems</li>\n<li>Implementing observability, transparency, and rollback mechanisms for AI-driven decisions</li>\n<li>Acting as the primary technical interface between Support, Product, and Platform Engineering to align technical roadmaps and unblock dependencies</li>\n<li>Setting a high engineering bar for quality, reliability, and maintainability in line with Databricks standards</li>\n<li>Mentoring engineers and SMEs across Software and Support Engineering functions</li>\n</ul>\n<p>We&#39;re looking for someone with:</p>\n<ul>\n<li>A BS or higher degree in Computer Science or a related field</li>\n<li>Technical leadership experience in large projects similar to those described, including automation tooling, distributed systems, and APIs</li>\n<li>Extensive full-stack development experience</li>\n<li>Proven success designing and deploying production-grade automation in complex technical environments</li>\n<li>Hands-on experience with ML-assisted systems, decision support, or agentic automation</li>\n<li>Deep familiarity with distributed data platforms, developer tooling, and large-scale infrastructure systems</li>\n<li>Understanding of multi-cloud environments (AWS, Azure, GCP), compliance, and security constraints</li>\n</ul>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. 
The pay range for this role is $190,000-$261,250 USD.</p>","url":"https://yubhub.co/jobs/job_fd64db3e-49f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8416959002","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$190,000-$261,250 USD","x-skills-required":["Automation tooling","Distributed systems","APIs","Full-stack development","ML-assisted systems","Decision support","Agentic automation","Distributed data platforms","Developer tooling","Large-scale infrastructure systems","Multi-cloud environments","Compliance","Security constraints"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:44:19.005Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Mountain View, California; San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Automation tooling, Distributed systems, APIs, Full-stack development, ML-assisted systems, Decision support, Agentic automation, Distributed data platforms, Developer tooling, Large-scale infrastructure systems, Multi-cloud environments, Compliance, Security constraints","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":190000,"maxValue":261250,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b96ce52c-eaa"},"title":"Engineering Manager, Onboarding","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. 
By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>\n<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p><strong>Engineering</strong></p>\n<p>Engineering at Brex is about building systems that scale with speed and intention. Our teams span Software, Data, Security, and IT, and operate with high autonomy and deep collaboration. 
We tackle hard technical problems, own our outcomes, and push for excellence at every level, from architecture to deployment.</p>\n<p>It’s an environment where engineering is a craft, and builders become leaders.</p>\n<p><strong>What you’ll do</strong></p>\n<p>You will lead an engineering team focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, and integrations workflows that help customers realize value quickly.</p>\n<p>This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced customer journeys.</p>\n<p>The ideal candidate is an engineering leader with experience scaling user-facing onboarding systems, delivering high-quality product experiences, and partnering deeply across Product, Design, Operations, and GTM teams.</p>\n<p><strong>Where you’ll work</strong></p>\n<p>This role will be based in our San Francisco office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. 
Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday.</p>\n<p>As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding experience.</li>\n</ul>\n<ul>\n<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>\n</ul>\n<ul>\n<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>\n</ul>\n<ul>\n<li>Drive execution of the Onboarding roadmap, ensuring timely, high-quality delivery of systems and features that help customers activate and realize value.</li>\n</ul>\n<ul>\n<li>Lead and manage a team of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>\n</ul>\n<ul>\n<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>\n</ul>\n<ul>\n<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>\n</ul>\n<ul>\n<li>6+ years of software engineering experience with strong technical depth.</li>\n</ul>\n<ul>\n<li>3+ years of experience managing or leading engineers in a high-growth environment.</li>\n</ul>\n<ul>\n<li>Strong technical background and understanding of software development principles.</li>\n</ul>\n<ul>\n<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>\n</ul>\n<ul>\n<li>Regularly works with cross-functional partners (e.g. 
Product, Design, Operations, Sales) and excels in driving alignment across stakeholders.</li>\n</ul>\n<ul>\n<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>\n</ul>\n<ul>\n<li>Track record building AI-powered product experiences, including LLM-driven automation and personalization.</li>\n</ul>\n<p><strong>Bonus points</strong></p>\n<ul>\n<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>\n</ul>\n<ul>\n<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer-facing experiences.</li>\n</ul>\n<ul>\n<li>You have started your own technology venture or were an early technical founder/employee. We value entrepreneurial spirit &amp; scrappiness!</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>The expected salary range for this role is $240,000 - $300,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_b96ce52c-eaa","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8461600002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$240,000 - $300,000","x-skills-required":["Software engineering","Technical leadership","AI-powered product experiences","Data-driven mindset","Cross-functional collaboration","Full-stack engineering"],"x-skills-preferred":["Data platforms","Workflow automation","Customer lifecycle products","LLM-driven automation","Personalization"],"datePosted":"2026-04-18T15:44:15.497Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Software engineering, Technical leadership, AI-powered product experiences, Data-driven mindset, Cross-functional collaboration, Full-stack engineering, Data platforms, Workflow automation, Customer lifecycle products, LLM-driven automation, Personalization","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":240000,"maxValue":300000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8664b981-66c"},"title":"Data Platform Solutions Architect (Professional Services) - Emerging Enterprise & DNB","description":"<p>We&#39;re hiring for multiple roles within our Professional Services team. 
Depending on experience and scope, this position may be offered as a Senior Solutions Consultant or a Resident Solutions Architect. You may know this role as a Big Data Solutions Architect, Analytics Architect, Data Platform Architect, or Technical Consultant. The final title will align to your experience, technical depth, and customer-facing ownership.</p>\n<p>As a Data Platform Solutions Architect on our Professional Services team for the Emerging Enterprise &amp; Digital Natives business in EMEA, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Drive high-impact customer projects: Design and build reference architectures, implement production use cases, and create “how-to” guides tailored to the unique needs of fast-moving Emerging Enterprise &amp; Digital Native customers in EMEA.</li>\n</ul>\n<ul>\n<li>Collaborate on project scoping: Work closely with Engagement Managers and customers to define project scope, schedules, and deliverables for professional services engagements.</li>\n</ul>\n<ul>\n<li>Enable transformational initiatives: Guide strategic customers through their end-to-end big data journeys, migrating from legacy platforms and deploying industry-leading data and AI applications on the Databricks platform.</li>\n</ul>\n<ul>\n<li>Consult on architecture &amp; design: Provide thought leadership on solution design and implementation strategies, ensuring customers can successfully evaluate and adopt Databricks.</li>\n</ul>\n<ul>\n<li>Offer advanced 
support: Serve as an escalation point for operational issues, collaborating with Databricks Support and Engineering to resolve challenges quickly.</li>\n</ul>\n<ul>\n<li>Align technical delivery: Partner with cross-functional Databricks teams (Technical, PM, Architecture, and Customer Success) to align on milestones, ensuring customer needs and deadlines are met.</li>\n</ul>\n<ul>\n<li>Amplify product feedback: Provide implementation insights to Databricks Product and Support teams, guiding rapid improvements in features and troubleshooting for customers.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>Extensive experience in data engineering, data platforms &amp; analytics</li>\n</ul>\n<ul>\n<li>Comfortable writing code in either Python or Scala</li>\n</ul>\n<ul>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n</ul>\n<ul>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n</ul>\n<ul>\n<li>Familiarity with CI/CD for production deployments</li>\n</ul>\n<ul>\n<li>Working knowledge of MLOps</li>\n</ul>\n<ul>\n<li>Design and deployment of performant end-to-end data architectures</li>\n</ul>\n<ul>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n</ul>\n<ul>\n<li>Documentation and white-boarding skills.</li>\n</ul>\n<ul>\n<li>Experience working with clients and managing conflicts.</li>\n</ul>\n<ul>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n</ul>\n<ul>\n<li>Travel to customers 10% of the time</li>\n</ul>\n<ul>\n<li>[Preferred] Databricks Certification but not essential</li>\n</ul>
","url":"https://yubhub.co/jobs/job_8664b981-66c","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8439047002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","performant end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":["Databricks Certification"],"datePosted":"2026-04-18T15:43:52.925Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, United Kingdom"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_72260a13-6ce"},"title":"Web Analytics & Insights Manager","description":"<p>Join the team as Twilio&#39;s next Web Analytics &amp; Insights Manager.</p>\n<p>This position is needed to partner in driving a scalable, holistic web analytics implementation strategy while maintaining data integrity of existing setup, ultimately ensuring Twilio.com web tracking capabilities continue to grow, are utilized by the company, and are accurate + reliable. 
The ideal candidate collaborates with several cross-functional teams at Twilio such as UX, CRO, Web Strategy, Web Dev, Demand Gen, SEO, Content, and more to deliver scalable and easy-to-use web analytics solutions and insights.</p>\n<p>As a subject matter expert in this space, you will grow your presence company-wide and eventually own relationships cross-functionally.</p>\n<p>We are looking for someone with proven experience in GA4 implementation, business/marketing analytics reporting, analysis and stakeholder communications.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Work closely with the Web team to deliver desired web analytics tracking for business stakeholders</li>\n<li>Own and maintain a holistic GA/web analytics strategy, implementation, troubleshooting, documentation + enablement</li>\n<li>Brainstorm with technical and non-technical stakeholders on creative ways to meet data requirements, while balancing site performance, holistic tagging strategies, and expanded data capabilities</li>\n<li>Drive continual web analytics capability improvements, and maintain high standards for data integrity</li>\n<li>Own and share recurring monthly and ad hoc analysis, deep dives, and storytelling to uncover the &#39;so what&#39;, providing actionable insights and recommendations across site, page, channel and A/B testing performance</li>\n<li>Strategize and build scalable solutions that support company-wide enablement and the utilization of web analytics for data-driven decision-making</li>\n<li>Partner with our CDP SMEs to orchestrate a unified data strategy, maximizing the integration between our CDP and Google Analytics to drive a 360-degree view of the customer journey</li>\n</ul>\n<p>Qualifications:</p>\n<ul>\n<li>5 years of experience in web analytics</li>\n<li>Expertise in Google Analytics 4, both implementation and analysis.</li>\n<li>Strong knowledge of JavaScript, HTML, and browser dev tools.</li>\n<li>Hands-on experience with Google Tag Manager (GTM)</li>\n<li>Familiarity with A/B and multivariate testing frameworks, as well as web 
personalization</li>\n<li>Strong communication skills, with the ability to communicate effectively with both technical and non-technical audiences</li>\n<li>Detail-oriented with a passion for clean, trustworthy data.</li>\n<li>Strong analytical skills with a track record of independently exploring data, delivering insights, storytelling, and actionable recommendations.</li>\n<li>Excellent collaborator across cross-functional teams; skilled in managing stakeholder projects</li>\n<li>Highly reliable and accountable.</li>\n<li>A self-starter.</li>\n<li>Interest or experience in applying AI-driven solutions to analytics workflows.</li>\n</ul>\n<p>Desired:</p>\n<ul>\n<li>Familiarity with BigQuery, Adobe AEM, Adobe Target</li>\n<li>Interest or exposure to fraud monitoring tools and processes.</li>\n<li>Experience with Looker Studio, Tableau, or other BI tools.</li>\n<li>Working knowledge of Customer Data Platforms (CDPs) and analytics data integrations.</li>\n<li>A deeper understanding of:</li>\n</ul>\n<ul>\n<li>First-party and third-party cookies</li>\n<li>Cross-domain tracking (including the limitations of web protocols and UTM strategies)</li>\n<li>Tagging and tracking strategies (e.g., when to use Segment → GA4 vs. GTM → GA4 for optimal efficiency)</li>\n<li>Ad pixel setup and UTM deployment strategies</li>\n</ul>\n<p>Travel:</p>\n<p>We prioritize connection and opportunities to build relationships with our customers and each other. For this role, approximately 15% travel is anticipated to help you connect in-person in a meaningful way.</p>\n<p>What We Offer:</p>\n<p>Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.</p>\n<p>Compensation:</p>\n<p>*Please note this role is open to candidates outside of California, Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, New Jersey, New York, Vermont, Washington D.C., and Washington State. 
The information below is provided for candidates hired in those locations only. The estimated pay ranges for this role are as follows:</p>\n<ul>\n<li>Based in Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, Vermont, or Washington D.C.: $106,320 - $132,900</li>\n<li>Based in New York, New Jersey, Washington State, or California (outside of the San Francisco Bay area): $112,560 - $140,700</li>\n<li>Based in the San Francisco Bay area, California: $125,040 - $156,300</li>\n</ul>\n<p>This role may be eligible to participate in Twilio’s equity plan and corporate bonus plan. All roles are generally eligible for the following benefits: health care insurance, 401(k) retirement account, paid sick time, paid personal time off, paid parental leave. The successful candidate’s starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location.</p>\n<p>Application deadline information:</p>\n<p>Applications for this role are intended to be accepted until April 30, 2026, but may change based on business needs.</p>\n<p>Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That&#39;s why we seek out colleagues who embody our values, something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you&#39;re ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn&#39;t what you&#39;re looking for, please consider other open positions.</p>\n<p>Twilio is proud to be an equal opportunity employer. 
We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_72260a13-6ce","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Twilio","sameAs":"https://www.twilio.com/","logo":"https://logos.yubhub.co/twilio.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/twilio/jobs/7736718","x-work-arrangement":"remote","x-experience-level":null,"x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Google Analytics 4","JavaScript","HTML","Browser dev tools","Google Tag Manager","A/B and multivariate testing frameworks","Web personalization","Communication skills","Analytical skills","Data analysis","Insights","Storytelling","Actionable recommendations"],"x-skills-preferred":["Big Query","Adobe AEM","Adobe Target","Fraud monitoring tools","Looker Studio","Tableau","Customer Data Platforms","Analytics data integrations"],"datePosted":"2026-04-18T15:43:43.988Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - 
US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Google Analytics 4, JavaScript, HTML, Browser dev tools, Google Tag Manager, A/B and multivariate testing frameworks, Web personalization, Communication skills, Analytical skills, Data analysis, Insights, Storytelling, Actionable recommendations, Big Query, Adobe AEM, Adobe Target, Fraud monitoring tools, Looker Studio, Tableau, Customer Data Platforms, Analytics data integrations"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2afc821d-248"},"title":"Resident Solutions Architect - Public Sector","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects that lead to 
a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Databricks Certification</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. 
The pay range(s) for this role are listed below and represent the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in, visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2afc821d-248","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8494149002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems","Apache Spark","CI/CD","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:43:21.563Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Philadelphia, 
Pennsylvania"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_34a0bf55-11a"},"title":"Resident Solutions Architect - Communications, Media, Entertainment & Games","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge 
of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines</li>\n<li>Documentation and white-boarding skills</li>\n<li>Experience working with clients and managing conflicts</li>\n</ul>\n<p>Databricks Certification</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role are listed below and represent the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in, visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_34a0bf55-11a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8461222002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:43:17.113Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Boston, Massachusetts"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7d723067-22d"},"title":"Resident Solutions Architect - Healthcare & Life Sciences","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and 
other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime 
internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7d723067-22d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8494144002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","performant end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:43:01.843Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dallas, Texas"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant 
end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7e5c6f46-bb6"},"title":"Resident Solutions Architect - Public Sector","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work 
with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Databricks Certification</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. 
The pay range(s) for this role are listed below and represent the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in, visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7e5c6f46-bb6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8456975002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems","Apache Spark","CI/CD","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:43:01.352Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dallas, 
Texas"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b4a461d1-b6b"},"title":"Resident Solutions Architect - Public Sector","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service. 
You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding 
skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b4a461d1-b6b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8494128002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:50.996Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, D.C."}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_32d8d11d-9dc"},"title":"Resident Solutions 
Architect - Healthcare & Life Sciences","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Nice to have: Databricks Certification</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. 
The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in, visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>","url":"https://yubhub.co/jobs/job_32d8d11d-9dc","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8371312002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems","Apache Spark","CI/CD","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:37.300Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City, New 
York"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3e92e8a2-811"},"title":"Resident Solutions Architect - Public Sector","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Nice to have: Databricks Certification</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. 
The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in, visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>","url":"https://yubhub.co/jobs/job_3e92e8a2-811","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8494130002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:35.247Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City, New 
York"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6da16fc8-e49"},"title":"Product Manager, Trip Quality Merchandising and AI","description":"<p>We&#39;re seeking an experienced product leader to own the guest-facing quality merchandising experience. As a Product Manager, you will join the team focused on trip quality, building features that close the gap between guest expectations and actual trip experiences.</p>\n<p>This team&#39;s work touches every single booking decision on a platform that facilitates hundreds of millions of trips and an incredibly diverse global inventory. You will ship and iterate on AI-powered features that help guests make more confident booking decisions, synthesizing signals from reviews, ratings, and listing data into quality information that is clear, personalized, and trustworthy.</p>\n<p>This means working hands-on with Data Science to evaluate model performance and translate outputs into better product decisions, not just defining requirements and handing them off. 
This role requires close partnership with the Policy, Privacy, and Legal teams, particularly on review integrity, AI-generated content, and trust-related guest-facing features.</p>\n<p>You will work especially closely with the quality data platform team to generate and consume accurate quality signals that foster a trusted marketplace, as well as across Design, Engineering, and Data Science to ship and iterate at scale.</p>\n<p>This is a critical role that spans the end-to-end guest experience around listing quality information, and the features you build will be part of the experience for every guest who considers booking on Airbnb.</p>\n<p>As a Product Manager, you will:</p>\n<ul>\n<li>Develop and execute the product roadmap for Trip Quality Merchandising and AI, with a focus on shipping AI-powered features that improve how guests evaluate listing quality.</li>\n<li>Define product requirements for generative AI features (e.g. listing summaries, quality highlights) and ML ranking systems, including evaluation criteria, guardrails, and iteration plans.</li>\n<li>Partner with Data Science to assess model performance, understanding where outputs are accurate and trustworthy and where they fall short, and translate those assessments into concrete product improvements.</li>\n<li>Craft the product narrative that inspires teams, leadership, and the company and builds alignment on the quality merchandising strategy.</li>\n<li>Partner with design, engineering, and customer support to deliver features that improve both guest and host experiences.</li>\n<li>Collaborate closely with our Policy, Privacy, and Legal teams on topics that are essential to making Airbnb one of the most trusted marketplaces in the world.</li>\n</ul>\n<p>You will need to have:</p>\n<ul>\n<li>10+ years of product management experience.</li>\n<li>Hands-on track record shipping AI/ML-powered features on consumer products, including generative AI features (LLM-based summarization, content 
generation) and ML ranking or personalization systems, with direct involvement in defining evaluation criteria and improving model outputs.</li>\n<li>Expert ability to use data and business analysis to inform product strategy and drive decisions.</li>\n<li>Domain familiarity with marketplace trust/quality, search/ranking/relevance, or content/feedback systems is strongly preferred.</li>\n<li>Experience creating product messaging and delivering to customers.</li>\n<li>Demonstrated track record of building cross-functional and executive leadership alignment.</li>\n<li>Experience working on consumer products with marketplace dynamics on a global scale.</li>\n<li>Entrepreneurial track record of taking an idea to reality.</li>\n<li>Excited to lead cross-functional execution, from strategy through shipped product in a fast-paced environment.</li>\n<li>Desire to do individual contributor product management work.</li>\n<li>Excellent written and verbal communication; adept at simplifying complex AI/ML and data concepts for diverse audiences.</li>\n</ul>","url":"https://yubhub.co/jobs/job_6da16fc8-e49","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Airbnb","sameAs":"https://www.airbnb.com/","logo":"https://logos.yubhub.co/airbnb.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/airbnb/jobs/7651661","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$224,000-$280,000 USD","x-skills-required":["product management","AI/ML","generative AI","ML ranking systems","data science","policy","privacy","legal","quality data platform","design","engineering","customer support"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:17.928Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA, United 
States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"product management, AI/ML, generative AI, ML ranking systems, data science, policy, privacy, legal, quality data platform, design, engineering, customer support","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":224000,"maxValue":280000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4cd630c8-77d"},"title":"Resident Solutions Architect - Public Sector","description":"<p>Job Title: Resident Solutions Architect - Public Sector</p>\n<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope various professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>\n<li>Provide an escalated level of support for customer operational issues</li>\n<li>Work with the Databricks technical team, Project Manager, 
Architect, and Customer team to ensure the technical components of the engagement are delivered to meet customer needs</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>6+ years experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines</li>\n<li>Documentation and white-boarding skills</li>\n<li>Experience working with clients and managing conflicts</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Pay Range Transparency:</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range for this role is $180,656-$248,360 USD.</p>\n<p>About Databricks:</p>\n<p>Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics, and AI. 
Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</p>\n<p>Benefits:</p>\n<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, click here.</p>\n<p>Our Commitment to Diversity and Inclusion:</p>\n<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards.</p>","url":"https://yubhub.co/jobs/job_4cd630c8-77d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8494137002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems","Apache Spark","CI/CD","MLOps","end-to-end data architectures","technical project delivery","documentation","white-boarding"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:16.862Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Austin, Texas"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation, 
white-boarding","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_50d65da2-2e4"},"title":"Resident Solutions Architect - Healthcare & Life Sciences","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform. Your responsibilities will include providing data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>You will work on a variety of impactful customer technical projects, including designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases. You will also work with engagement managers to scope professional services work with input from the customer, guide strategic customers as they implement transformational big data projects, and consult on architecture and design.</p>\n<p>To be successful in this role, you will need to have 6+ years of experience in data engineering, data platforms, and analytics, be comfortable writing code in either Python or Scala, and have working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one. 
You should also have deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals.</p>\n<p>The pay range for this role is $180,656-$248,360 USD, and the total compensation package may also include eligibility for annual performance bonus, equity, and benefits.</p>","url":"https://yubhub.co/jobs/job_50d65da2-2e4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8494143002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms","analytics","Python","Scala","Cloud ecosystems","Apache Spark","distributed computing"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:13.802Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Chicago, Illinois"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms, analytics, Python, Scala, Cloud ecosystems, Apache Spark, distributed computing","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8d8b3af4-285"},"title":"Resident Solutions Architect - Public Sector","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data 
science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least 
one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>Nice to have: Databricks Certification</p>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in, visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 2 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 3 Pay Range $180,656-$248,360 USD</p>\n<p>Zone 4 Pay Range $180,656-$248,360 USD</p>","url":"https://yubhub.co/jobs/job_8d8b3af4-285","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8494147002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems","Apache Spark","CI/CD","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:10.634Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Atlanta, Georgia"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1d222227-15b"},"title":"Resident Solutions Architect - Financial Services","description":"<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to 
help customers get the most value out of their data.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap hands-on projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>\n<li>Provide an escalated level of support for customer operational issues</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Capable of design and deployment of highly performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines</li>\n<li>Documentation and white-boarding skills</li>\n<li>Experience working with clients and managing conflicts</li>\n<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>\n<li>Travel to customers up to 20% of the time</li>\n</ul>\n<p>Nice to have:</p>\n<ul>\n<li>Databricks Certification</li>\n</ul>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. 
The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in visit our page here.</p>\n<p>Zone 1 Pay Range $180,656-$248,360 USD Zone 2 Pay Range $180,656-$248,360 USD Zone 3 Pay Range $180,656-$248,360 USD Zone 4 Pay Range $180,656-$248,360 USD</p>","url":"https://yubhub.co/jobs/job_1d222227-15b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8456969002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:42:03.482Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Chicago, 
Illinois"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, technical project delivery, documentation and white-boarding skills, client management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6ac59b11-215"},"title":"Engineering Manager, Onboarding","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>\n<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>\n<p>What you’ll do</p>\n<p>You will lead an engineering team focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, and integrations workflows that help customers realize value quickly.</p>\n<p>This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced 
customer journeys.</p>\n<p>The ideal candidate is an engineering leader with experience scaling user-facing onboarding systems, delivering high-quality product experiences, and partnering deeply across Product, Design, Operations, and GTM teams.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding experience.</li>\n<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>\n<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>\n<li>Drive execution of the Onboarding roadmap, ensuring timely, high-quality delivery of systems and features that help customers activate and realize value.</li>\n<li>Lead and manage a team of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>\n<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>\n<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>\n<li>6+ years of software engineering experience with strong technical depth.</li>\n<li>3+ years of experience managing or leading engineers in a high-growth environment.</li>\n<li>Strong technical background and understanding of software development principles.</li>\n<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>\n<li>Regularly works with cross-functional partners (e.g. 
Product, Design, Operations, Sales) and excels in driving alignment across stakeholders.</li>\n<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>\n<li>Track record building AI-powered product experiences, including LLM-driven automation and personalization.</li>\n</ul>\n<p>Bonus points</p>\n<ul>\n<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>\n<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer-facing experiences.</li>\n<li>You have started your own technology venture or were an early technical founder/employee.</li>\n</ul>\n<p>Experience Level: senior</p>\n<p>Employment Type: full-time</p>\n<p>Workplace Type: hybrid</p>\n<p>Category: Engineering</p>\n<p>Industry: Technology</p>\n<p>Salary Range: $240,000 CAD - $300,000 CAD</p>\n<p>Required Skills:</p>\n<ul>\n<li>Software engineering experience</li>\n<li>Technical leadership</li>\n<li>AI-powered product experiences</li>\n<li>Data-driven mindset</li>\n<li>Cross-functional collaboration</li>\n</ul>\n<p>Preferred Skills:</p>\n<ul>\n<li>Data platforms (Snowflake, Hex)</li>\n<li>Onboarding, implementation, identity, workflow automation</li>\n<li>Customer lifecycle products</li>\n<li>Technology venture founding/early employee experience</li>\n</ul>","url":"https://yubhub.co/jobs/job_6ac59b11-215","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8461597002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$240,000 CAD - $300,000 CAD","x-skills-required":["Software engineering 
experience","Technical leadership","AI-powered product experiences","Data-driven mindset","Cross-functional collaboration"],"x-skills-preferred":["Data platforms (Snowflake, Hex)","Onboarding, implementation, identity, workflow automation","Customer lifecycle products","Technology venture founding/early employee experience"],"datePosted":"2026-04-18T15:41:57.166Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Vancouver, British Columbia, Canada"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Software engineering experience, Technical leadership, AI-powered product experiences, Data-driven mindset, Cross-functional collaboration, Data platforms (Snowflake, Hex), Onboarding, implementation, identity, workflow automation, Customer lifecycle products, Technology venture founding/early employee experience","baseSalary":{"@type":"MonetaryAmount","currency":"CAD","value":{"@type":"QuantitativeValue","minValue":240000,"maxValue":300000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6fed2bb6-3b6"},"title":"Resident Solutions Architect - Public Sector","description":"<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform. 
You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Designing and building reference architectures for customers</li>\n<li>Creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Guiding strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consulting on architecture and design; bootstrapping or implementing customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>\n<li>Providing an escalated level of support for customer operational issues</li>\n<li>Working with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>\n<li>Working with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues</li>\n</ul>\n<p>To be successful in this role, you will need:</p>\n<ul>\n<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Design and deployment of performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and 
timelines</li>\n<li>Documentation and white-boarding skills</li>\n<li>Experience working with clients and managing conflicts</li>\n<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>\n<li>Travel to customers 20% of the time</li>\n</ul>\n<p>The pay range for this role is $180,656-$248,360 USD per year, depending on location and experience.</p>","url":"https://yubhub.co/jobs/job_6fed2bb6-3b6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8461321002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD per year","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","performant end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:41:52.838Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Chicago, Illinois"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client 
management","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_32a63583-c9b"},"title":"Sr. Analyst, Strategic Measurement","description":"<p>At Twilio, we&#39;re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences.</p>\n<p>Our dedication to remote-first work, and strong culture of connection and global inclusion means that no matter your location, you’re part of a vibrant team with diverse experiences making a global impact each day.</p>\n<p>As we continue to revolutionize how the world interacts, we’re acquiring new skills and experiences that make work feel truly rewarding.</p>\n<p>Join the team as Twilio’s next Sr. Analyst, Strategic Measurement.</p>\n<p>About the job</p>\n<p>We are actively recruiting for this role to fill an existing vacancy. 
This position is needed to serve as the &quot;New Frontiers&quot; lead for the Marketing Strategy, Analytics &amp; Innovation team, focusing on ambiguous, highly strategic emerging channels, potential programs &amp; measurement optimization/instrumentation.</p>\n<p>You will be responsible for defining, measuring, and accelerating our growth in new digital landscapes, specifically focusing on AI Engine Optimization (AEO), brand measurement and advanced attribution frameworks.</p>\n<p>Partnering closely with the global marketing analytics teams, Brand, Demand Gen, PR, Content, and Developer Network teams, you will execute against high-visibility objectives enabling marketing&#39;s business impact to executive leadership.</p>\n<p>Ideally, the candidate has a strong statistical background, a substantial capacity for hypothesis-driven thinking, and the ability to deliver &quot;unmistakable clarity&quot; in complex executive presentations.</p>\n<p>Responsibilities</p>\n<p>In this role, you’ll:</p>\n<p>Analyse LLM visibility and Share of Voice, create a continuous feedback loop across various teams to optimise Twilio&#39;s presence in AI search/AEO (answer engine optimisation), drive enablement and operational efforts to ensure consistent tool usage and storyline.</p>\n<p>Support the strategy, data ingestion, and modelling required to build and maintain net new and existing brand measurement components. Drive a single, cohesive narrative combining disparate data sets.</p>\n<p>Inspire experimentation adoption in teams. 
Build the frameworks and structure to drive experimentation rigor and clarity of incrementality of marketing initiatives.</p>\n<p>Support Media Mix Modelling (MMM) and Multi-Touch Attribution (MTA) initiatives to understand the impact of various marketing channels on Sign Ups, SQLs, activations, and guide budget allocation.</p>\n<p>Translate complex data sets into actionable insights and strategic recommendations for exec-level audiences.</p>\n<p>Own enablement efforts for emerging channels, sharing accomplishments, KPIs, and cross-team collaborations broadly across the marketing organisation.</p>\n<p>Qualifications</p>\n<p>Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply.</p>\n<p>If your career is just starting or hasn’t followed a traditional path, don’t let that stop you from considering Twilio.</p>\n<p>We are always looking for people who will bring something new to the table!</p>\n<p>Required:</p>\n<p>5+ years of experience in statistical analysis, experimentation and/or scenario planning.</p>\n<p>A strong understanding of SEO and emerging AI Search (AEO/GEO) landscapes.</p>\n<p>A hypothesis-driven thinker who does not just report the “what,” but actively tests the “why”. 
Able to formulate testable guesses and distill the essential from complex marketing data.</p>\n<p>Experience with running or interpreting advanced measurement frameworks (Incrementality, MMM, MTA).</p>\n<p>Highly reliable and a self-starter.</p>\n<p>Interest or experience in applying AI-driven solutions to analytics workflows.</p>\n<p>Strong communication skills with the ability to communicate effectively with both technical and non-technical audiences.</p>\n<p>Detail-oriented with a passion for clean, trustworthy data.</p>\n<p>Strong analytical skills.</p>\n<p>Excellent collaborator across cross-functional teams; skilled in managing stakeholder projects end-to-end.</p>\n<p>Desired:</p>\n<p>Experience supporting various stakeholder functions and their insights: Brand, PR, Social, Content, Demand Generation</p>\n<p>Experience with BI tools (Tableau, Looker Studio) and modelling (ROI, LTV:CAC, payback).</p>\n<p>Interest or exposure to fraud monitoring tools and processes.</p>\n<p>Working knowledge of Customer Data Platforms (CDPs) and analytics data integrations.</p>\n<p>Location</p>\n<p>This role will be remote, and based in Alberta, British Columbia, or Ontario, Canada.</p>\n<p>Travel</p>\n<p>We prioritise connection and opportunities to build relationships with our customers and each other.</p>\n<p>For this role, approximately 15% travel is anticipated to help you connect in-person in a meaningful way.</p>\n<p>What We Offer</p>\n<p>Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings programme, and much more.</p>\n<p>Offerings vary by location.</p>\n<p>Based on role, employees may also be eligible for additional compensation and benefits, including but not limited to incentive programmes, commissions, equity grants, health and wellness benefits, retirement contributions, and paid time off.</p>\n<p>The estimated pay ranges for this role are as follows:</p>\n<p>$90,800 - $113,500 CAD</p>\n<p>Target Bonus Percentage 12.50%</p>\n<p>The 
successful candidate’s starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location.</p>\n<p>Twilio thinks big. Do you?</p>\n<p>We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things.</p>\n<p>That&#39;s why we seek out colleagues who embody our values, something we call Twilio Magic.</p>\n<p>Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts.</p>\n<p>So, if you&#39;re ready to unleash your full potential, do your best work, and be the best version of yourself, apply now!</p>\n<p>If this role isn’t what you’re looking for, please consider other open positions.</p>\n<p>Twilio is proud to be an equal opportunity employer.</p>\n<p>We do not discriminate based upon race, religion, colour, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics.</p>\n<p>We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law.</p>\n<p>Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act.</p>\n<p>Additionally, Twilio participates in the E-Verify programme in certain locations, as required by law.</p>","url":"https://yubhub.co/jobs/job_32a63583-c9b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Twilio","sameAs":"https://www.twilio.com/","logo":"https://logos.yubhub.co/twilio.com.png"},
"x-apply-url":"https://job-boards.greenhouse.io/twilio/jobs/7736722","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$90,800 - $113,500 CAD","x-skills-required":["statistical analysis","experimentation","scenario planning","SEO","AI Search (AEO/GEO) landscapes","hypothesis-driven thinking","advanced measurement frameworks","Incrementality","Media Mix Modelling (MMM)","Multi-Touch Attribution (MTA)","communication skills","data analysis","clean, trustworthy data","analytical skills","collaboration","project management"],"x-skills-preferred":["BI tools (Tableau, Looker Studio)","modelling (ROI, LTV:CAC, payback)","fraud monitoring tools and processes","Customer Data Platforms (CDPs)","analytics data integrations"],"datePosted":"2026-04-18T15:41:42.004Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - Canada"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Marketing","industry":"Technology","skills":"statistical analysis, experimentation, scenario planning, SEO, AI Search (AEO/GEO) landscapes, hypothesis-driven thinking, advanced measurement frameworks, Incrementality, Media Mix Modelling (MMM), Multi-Touch Attribution (MTA), communication skills, data analysis, clean, trustworthy data, analytical skills, collaboration, project management, BI tools (Tableau, Looker Studio), modelling (ROI, LTV:CAC, payback), fraud monitoring tools and processes, Customer Data Platforms (CDPs), analytics data 
integrations","baseSalary":{"@type":"MonetaryAmount","currency":"CAD","value":{"@type":"QuantitativeValue","minValue":90800,"maxValue":113500,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_dc943b12-110"},"title":"Engineering Manager, Onboarding","description":"<p><strong>Job Title</strong></p>\n<p>Engineering Manager, Onboarding</p>\n<p><strong>About Us</strong></p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p><strong>Job Description</strong></p>\n<p>You will lead an engineering team focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, and integrations workflows that help customers realize value quickly. 
This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced customer journeys.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding experience.</li>\n<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>\n<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>\n<li>Drive execution of the Onboarding roadmap, ensuring timely, high-quality delivery of systems and features that help customers activate and realize value.</li>\n<li>Lead and manage a team of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>\n<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>\n<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>\n<li>6+ years of software engineering experience with strong technical depth.</li>\n<li>3+ years of experience managing or leading engineers in a high-growth environment.</li>\n<li>Strong technical background and understanding of software development principles.</li>\n<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>\n<li>Regularly works with cross-functional partners (e.g. 
Product, Design, Operations, Sales) and excels in driving alignment across stakeholders.</li>\n<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>\n<li>Track record building AI-powered product experiences, including LLM-driven automation and personalization.</li>\n</ul>\n<p><strong>Bonus Points</strong></p>\n<ul>\n<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>\n<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer facing experiences.</li>\n<li>You have started your own technology venture or were an early technical founder/employee.</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>The expected salary range for this role is $240,000 - $300,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_dc943b12-110","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8461599002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$240,000 - $300,000","x-skills-required":["Software engineering","Leadership","Technical direction","Cross-functional collaboration","Data-driven decision making","AI-powered product experiences","LLM-driven automation","Personalization"],"x-skills-preferred":["Data platforms","Onboarding and implementation","Identity and workflow automation","Customer lifecycle products"],"datePosted":"2026-04-18T15:41:36.344Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New York, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Software engineering, Leadership, Technical direction, Cross-functional collaboration, Data-driven decision making, AI-powered product experiences, LLM-driven automation, Personalization, Data platforms, Onboarding and implementation, Identity and workflow automation, Customer lifecycle products","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":240000,"maxValue":300000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d0793a44-d91"},"title":"Resident Solutions Architect - Financial Services","description":"<p>As a Senior Big Data Solutions Architect (Sr Resident 
Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on architecture and design; bootstrap hands-on projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>\n<li>Provide an escalated level of support for customer operational issues</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge 
of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing using Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Capable of designing and deploying highly performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>\n<li>Travel to customers up to 20% of the time</li>\n</ul>\n<p>Nice to have:</p>\n<ul>\n<li>Databricks Certification</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d0793a44-d91","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8461328002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":["Databricks Certification"],"datePosted":"2026-04-18T15:41:30.682Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Charlotte, North 
Carolina"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5fd85b1e-563"},"title":"Resident Solutions Architect - Financial Services","description":"<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>\n<p>You will deliver data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>\n<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>\n<p>You will report to the regional Manager/Lead.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>\n<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>\n<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>\n<li>Consult on 
architecture and design; bootstrap hands-on projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>\n<li>Provide an escalated level of support for customer operational issues.</li>\n<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>\n<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>\n<li>Comfortable writing code in either Python or Scala</li>\n<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>\n<li>Deep experience with distributed computing using Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>\n<li>Familiarity with CI/CD for production deployments</li>\n<li>Working knowledge of MLOps</li>\n<li>Capable of designing and deploying highly performant end-to-end data architectures</li>\n<li>Experience with technical project delivery - managing scope and timelines.</li>\n<li>Documentation and white-boarding skills.</li>\n<li>Experience working with clients and managing conflicts.</li>\n<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>\n<li>Travel to customers up to 20% of the time</li>\n<li>Nice to have: Databricks Certification</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5fd85b1e-563","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8456965002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,656-$248,360 USD","x-skills-required":["data engineering","data platforms & analytics","Python","Scala","Cloud ecosystems (AWS, Azure, GCP)","Apache Spark","CI/CD for production deployments","MLOps","design and deployment of highly performant end-to-end data architectures","technical project delivery","documentation and white-boarding skills","client management"],"x-skills-preferred":["Databricks Certification"],"datePosted":"2026-04-18T15:41:28.459Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dallas, Texas"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, data platforms & analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, design and deployment of highly performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180656,"maxValue":248360,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_faa865dc-a1d"},"title":"Senior Data Engineer, BizTech","description":"<p>We&#39;re seeking a hands-on expert to provide technical leadership in addressing BizTech&#39;s diverse data engineering needs and driving long-term strategies and best practices.</p>\n<p>As a Senior Data 
Engineer, you&#39;ll lead the design, implementation, and testing of data systems, from architecture to production. You&#39;ll build batch and real-time data systems that support business needs and critical products, ensuring data systems&#39; quality, performance, and stability through rigorous monitoring and quality assurance practices.</p>\n<p>You&#39;ll collaborate with cross-functional teams, including product managers, data scientists, and engineers, to develop scalable systems and drive data-driven decisions. You&#39;ll maintain strong partnerships with backend, data science, and machine learning teams to ensure seamless integration of data systems.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Leading the design, implementation, and testing of data systems, from architecture to production</li>\n<li>Building batch and real-time data systems that support business needs and critical products</li>\n<li>Ensuring data systems&#39; quality, performance, and stability through rigorous monitoring and quality assurance practices</li>\n<li>Collaborating with cross-functional teams to develop scalable systems and drive data-driven decisions</li>\n<li>Maintaining strong partnerships with backend, data science, and machine learning teams to ensure seamless integration of data systems</li>\n</ul>\n<p>We&#39;re looking for someone with 9+ years of relevant experience, a Bachelor&#39;s/Master&#39;s degree in CS/EE, and extensive experience in designing, building, and operating distributed data platforms. You should be proficient in Java, Scala, or Python, with strong skills in data processing and SQL querying. A proven track record of designing and optimizing batch and real-time data pipelines is a must.</p>\n<p>In addition to technical expertise, we&#39;re looking for someone with excellent written and verbal communication skills, with the ability to influence stakeholders and convey complex technical concepts. 
You should be a strong leader and mentor, with experience guiding teams on best practices and technical strategies.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_faa865dc-a1d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Airbnb","sameAs":"https://www.airbnb.com/","logo":"https://logos.yubhub.co/airbnb.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/airbnb/jobs/7640881","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Java","Scala","Python","data processing","SQL querying","distributed data platforms","batch and real-time data pipelines"],"x-skills-preferred":["machine learning","data science","backend development"],"datePosted":"2026-04-18T15:40:41.162Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bangalore, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Java, Scala, Python, data processing, SQL querying, distributed data platforms, batch and real-time data pipelines, machine learning, data science, backend development"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9b657c4e-8a1"},"title":"Member of Technical Staff - Data Platform","description":"<p><strong>About the Role</strong></p>\n<p>As a software engineer on the Data Platform team, you will design, build, and operate the distributed systems powering X&#39;s data movement and compute. 
You will take ownership of infrastructure components that process trillions of events daily, driving the scalability, performance, and reliability of the systems that power product and ML workloads across the company.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design and implement high-throughput, low-latency data ingestion and transport systems.</li>\n<li>Scale and optimize multi-tenant Kafka infrastructure supporting real-time workloads.</li>\n<li>Extend and tune Spark, Flink, and Trino for demanding production pipelines.</li>\n<li>Build interfaces, APIs, and pipelines enabling teams to query, process, and move data at petabyte scale.</li>\n<li>Debug and optimize distributed systems, with a focus on reliability and performance under load.</li>\n<li>Collaborate with ML, product, and infrastructure teams to unblock critical data workflows.</li>\n</ul>\n<p><strong>Basic Qualifications</strong></p>\n<ul>\n<li>Proven expertise in distributed systems, stream processing, or large-scale data platforms.</li>\n<li>Proficiency in Rust, Go, Scala or similar systems languages.</li>\n<li>Hands-on experience with Kafka, Flink, Spark, Trino, or Hadoop in production.</li>\n<li>Strong debugging, profiling, and performance optimization skills.</li>\n<li>Track record of shipping and maintaining critical infrastructure.</li>\n<li>Comfortable working in fast-moving, high-stakes environments with minimal guardrails.</li>\n</ul>\n<p><strong>Compensation and Benefits</strong></p>\n<p>$180,000 - $440,000 USD</p>\n<p>Base salary is just one part of our total rewards package at X, which also includes equity, comprehensive medical, vision, and dental coverage, access to a 401(k) retirement plan, short &amp; long-term disability insurance, life insurance, and various other discounts and perks.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9b657c4e-8a1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"xAI","sameAs":"https://www.x.ai/","logo":"https://logos.yubhub.co/x.ai.png"},"x-apply-url":"https://job-boards.greenhouse.io/xai/jobs/4803862007","x-work-arrangement":"hybrid","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$180,000 - $440,000 USD","x-skills-required":["distributed systems","stream processing","large-scale data platforms","Rust","Go","Scala","Kafka","Flink","Spark","Trino","Hadoop","debugging","profiling","performance optimization"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:40:03.394Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Palo Alto, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"distributed systems, stream processing, large-scale data platforms, Rust, Go, Scala, Kafka, Flink, Spark, Trino, Hadoop, debugging, profiling, performance optimization","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":440000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0e9a97ce-6d9"},"title":"Enterprise Account Executive, Manufacturing & Automotive","description":"<p>As an Enterprise Account Executive, Manufacturing &amp; Automotive based in Tokyo, you&#39;ll drive the adoption of safe, frontier AI technology across Japan&#39;s manufacturing and automotive industries. 
You&#39;ll own the full sales cycle from prospecting to closing, working with senior leaders at OEMs, tier-1 suppliers, and industrial manufacturers to help them accelerate R&amp;D, optimize production operations, and transform engineering and knowledge work with Anthropic&#39;s AI solutions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Own and exceed revenue targets by winning new accounts and expanding relationships across Japan&#39;s major automotive OEMs, tier-1 and tier-2 suppliers, heavy industry manufacturers, electronics manufacturers, and industrial equipment companies</li>\n<li>Build and execute territory plans, identifying high-value opportunities across passenger vehicle and commercial vehicle OEMs, automotive parts suppliers, industrial machinery manufacturers, semiconductor and electronics producers, and materials/chemicals companies</li>\n<li>Lead consultative sales processes with senior stakeholders, including chief engineers, R&amp;D directors, heads of digital transformation, manufacturing engineering leaders, connected-vehicle and software-defined vehicle (SDV) leads, and plant operations executives, positioning AI as a driver for engineering productivity, design and simulation acceleration, supply chain intelligence, factory floor optimization, and in-vehicle/product experiences</li>\n<li>Orchestrate internal teams (Product, Engineering, Applied AI, Partnerships) to deliver solutions tailored to the rigorous safety, quality, and reliability standards of automotive and manufacturing workflows, while respecting IP protection and data sovereignty requirements</li>\n<li>Stay deeply informed about trends across Japan&#39;s manufacturing and automotive sectors and global industry shifts, including electrification, SDV, autonomous driving, Industry 4.0, and smart factory initiatives, ensuring Anthropic is consistently positioned as a relevant and forward-thinking partner to companies navigating rapid technological transformation</li>\n</ul>\n<p>You may 
be a good fit if you have:</p>\n<ul>\n<li>8+ years of enterprise sales experience in Japan with significant exposure to the manufacturing or automotive industry, whether selling directly into OEMs, tier-1 suppliers, industrial manufacturers, or adjacent technology partners (PLM, CAD/CAE, MES, industrial software, cloud/data platforms) within the ecosystem</li>\n<li>A genuine understanding of how product development, manufacturing operations, and engineering organizations work in Japan, including the distinct dynamics of keiretsu relationships, long-cycle OEM–supplier partnerships, and the coexistence of traditional monozukuri culture with digital transformation initiatives</li>\n<li>A track record of building trusted relationships with technically sophisticated engineering and manufacturing teams, navigating organizations where chief engineers, R&amp;D leaders, and plant/operations executives hold significant influence over technology decisions alongside traditional IT leadership</li>\n<li>Demonstrated ability to manage complex, multi-stakeholder sales cycles typical of automotive and manufacturing enterprises, where decisions often involve multiple business units, rigorous POC and evaluation processes, and long-horizon strategic planning</li>\n<li>Proven experience exceeding revenue targets by identifying and closing opportunities across a diverse portfolio of manufacturing and automotive companies with varying scales, business models, and digital maturity</li>\n<li>A knack for bringing order to chaos and an enthusiastic &#39;roll up your sleeves&#39; mentality; you are a true team player who thrives in ambiguous, startup-like environments</li>\n<li>A strategic, analytical approach to identifying opportunities within Japan&#39;s manufacturing and automotive sectors combined with relationship-focused execution that earns trust with engineering, manufacturing, and business leaders alike</li>\n<li>A passion for and/or experience with advanced AI systems. 
You feel strongly about ensuring frontier AI systems are developed safely and responsibly for broad benefit</li>\n<li>Excellent communication skills in both Japanese and English</li>\n</ul>\n<p>What We Offer:</p>\n<ul>\n<li>Competitive base salary and commission structure commensurate with experience</li>\n<li>Equity participation</li>\n<li>Comprehensive benefits package</li>\n<li>Hybrid work model with flexibility</li>\n<li>Access to cutting-edge AI technology and world-class research team</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0e9a97ce-6d9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.com","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5104760008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Enterprise sales experience","Manufacturing and automotive industry knowledge","Japanese language skills","English language skills","Advanced AI systems experience","Strategic and analytical thinking","Relationship-building skills","Complex sales cycle management"],"x-skills-preferred":["Keiretsu relationships","Long-cycle OEM-supplier partnerships","Monozukuri culture","Digital transformation initiatives","Cloud/data platforms","Industrial software","Supply chain intelligence","Factory floor optimization","In-vehicle/product experiences"],"datePosted":"2026-04-18T15:39:58.480Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Tokyo, Japan"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Automotive","skills":"Enterprise sales experience, Manufacturing and automotive industry knowledge, Japanese language skills, English language skills, Advanced AI systems 
experience, Strategic and analytical thinking, Relationship-building skills, Complex sales cycle management, Keiretsu relationships, Long-cycle OEM-supplier partnerships, Monozukuri culture, Digital transformation initiatives, Cloud/data platforms, Industrial software, Supply chain intelligence, Factory floor optimization, In-vehicle/product experiences"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7f904cf7-7bd"},"title":"Data Analyst II","description":"<p>Join us at Brex, the intelligent finance platform that empowers companies to spend smarter and move faster in over 200 markets. As a Data Analyst II, you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>\n<p>As a member of our Data organization, you will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses. 
This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</li>\n<li>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</li>\n<li>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</li>\n<li>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</li>\n<li>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</li>\n<li>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</li>\n<li>Contribute to the automation of recurring analyses and reporting workflows using Python.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>4+ years of experience in data analytics or a related role in a professional setting.</li>\n<li>3+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</li>\n<li>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</li>\n<li>Proficiency in Python for data analysis, automation, and scripting (Pandas, NumPy, and similar libraries).</li>\n<li>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</li>\n<li>Strong quantitative and analytical skills with a demonstrated ability 
to translate data into business insights.</li>\n<li>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</li>\n<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automated reporting, and build self-service data tools.</li>\n</ul>\n<p>Bonus points:</p>\n<ul>\n<li>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</li>\n<li>Familiarity with dbt for data modeling and transformation.</li>\n<li>Exposure to data pipeline orchestration tools (e.g., Airflow).</li>\n<li>Experience in fintech, financial services, or payments.</li>\n<li>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7f904cf7-7bd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex LLC","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8463703002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Python","Business Intelligence","Data Visualization","Generative AI","LLM-based tools"],"x-skills-preferred":["Cloud data platforms","dbt","Data pipeline orchestration tools","Fintech, financial services, or payments"],"datePosted":"2026-04-18T15:39:28.984Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"São Paulo, São Paulo, Brazil"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, 
financial services, or payments"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7ffabac7-275"},"title":"Director, Solutions & Forward Deployed Engineering","description":"<p>We are seeking a Director, Solutions &amp; Forward Deployed Engineering to lead the technical delivery of the Zus platform and help customers successfully connect their systems, data, and applications to Zus.</p>\n<p>Reporting to the Head of Customer Success &amp; Delivery, this role will own how customers integrate with the Zus platform. They’ll be responsible for ensuring healthcare organisations and digital health builders can reliably ingest data, connect EHR systems, and deploy applications powered by Zus APIs.</p>\n<p>This leader will oversee teams responsible for forward deployed engineering and technical enablement, working closely with customer engineering teams to integrate Zus into production environments and connect to data networks.</p>\n<p>You will guide customers through the complexity of healthcare interoperability, helping them translate real-world workflows into scalable integrations built on Zus.</p>\n<p>This is a hands-on player-coach role. 
You will lead the team while also personally engaging in complex implementations, architecture discussions, and customer deployments.</p>\n<p>You will champion the use of AI tools, automation frameworks, and reusable integration patterns to dramatically improve how quickly and reliably customers connect to the Zus platform.</p>\n<p>The ideal candidate combines deep experience in healthcare interoperability, enterprise software implementations, API platforms, and AI-enabled engineering workflows with the leadership skills required to scale a delivery organisation.</p>\n<p>Key responsibilities:</p>\n<ul>\n<li>Lead implementation and technical delivery - Own the technical delivery lifecycle from contract signature through production deployment and early adoption</li>\n<li>Lead and grow a team of Solutions Engineers and Forward Deployed Engineers - Develop career paths, performance expectations, and development plans for the team to ensure excellent execution of goals</li>\n<li>Ensure consistent, high-quality execution across multiple concurrent enterprise implementations</li>\n<li>Establish best practices for onboarding, implementation, integration, and go-live readiness</li>\n<li>Set customers up for success across multiple high-priority use cases</li>\n<li>Ensure customers achieve rapid time-to-value from the Zus platform</li>\n</ul>\n<p>Act as player-coach for complex implementations - Personally engage on strategic or technically complex customer deployments</p>\n<p>Guide integrations involving FHIR, HL7, CCD, APIs, SFTP pipelines, and EHR platforms</p>\n<p>Troubleshoot complex interoperability and data pipeline issues</p>\n<p>Work directly with engineering teams to deploy and operationalize Zus products</p>\n<p>Serve as a trusted technical advisor to customer technical and operational stakeholders</p>\n<p>Drive forward deployed engineering - Support customers in building production-grade applications and workflows on top of Zus APIs</p>\n<p>Help 
customers operationalize clinical and operational data across care delivery workflows</p>\n<p>Lead the development of reference architectures and deployment patterns</p>\n<p>Identify integration opportunities that accelerate product adoption and expansion</p>\n<p>Deliver training and technical enablement - Oversee technical onboarding and training programs for new customers</p>\n<p>Enable customer engineering and product teams to effectively build on the Zus platform</p>\n<p>Develop documentation, workshops, and enablement resources for technical users</p>\n<p>Drive AI-enabled implementation and automation - Lead the adoption of AI tools and automation frameworks across the delivery organisation</p>\n<p>Identify opportunities to automate manual implementation work using LLMs, scripting, and developer tooling</p>\n<p>Develop reusable automation patterns for all parts of the Zus ecosystem</p>\n<p>Help customers leverage Zus data to power AI-enabled workflows and analytics applications</p>\n<p>Partner with Product and Engineering - Translate customer implementation patterns into platform improvements</p>\n<p>Participate in technical discussions to find reusable integration patterns that can be embedded directly into the Zus platform</p>\n<p>Communicate customer needs to the Product &amp; Engineering teams</p>\n<p>You&#39;re a good fit because you have:</p>\n<ul>\n<li>10+ years of experience in technical implementation, solutions engineering, systems integration, or professional services leadership, preferably in healthtech, SaaS, or enterprise software</li>\n<li>Proven experience leading customer-facing teams and scaling implementation or professional services functions</li>\n<li>Deep expertise in healthcare data interoperability, including FHIR, HL7, CCD, and EHR integrations</li>\n<li>Strong understanding of APIs, data ingestion pipelines (ETL, JSON, CSV), and modern data platforms (e.g., Snowflake)</li>\n<li>Experience designing scalable implementation frameworks 
and reusable integration patterns</li>\n<li>Familiarity with secure environments and compliance frameworks (HIPAA, SOC 2)</li>\n<li>Executive presence and the ability to build trust with both technical and non-technical stakeholders</li>\n<li>Strong strategic thinking paired with a willingness to dive into complex technical or delivery challenges when needed</li>\n<li>A self-starter mindset and comfort operating in a fast-paced, evolving startup environment</li>\n<li>Passion for improving healthcare through better access to and use of data</li>\n<li>Willingness to travel up to ~25% for customer engagements, industry events, and company meetings</li>\n<li>Bachelor’s degree in Business, Engineering, or a related field (advanced degree a plus)</li>\n</ul>\n<p>Additional Information:</p>\n<p>We will offer you...</p>\n<ul>\n<li>Competitive compensation that reflects the value you bring to the team: a combination of cash and equity</li>\n<li>Robust benefits that include health insurance, wellness benefits, 401k with a match, unlimited PTO</li>\n<li>Opportunity to work alongside a passionate team that is determined to help change the world (and have fun doing it)</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7ffabac7-275","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Zus","sameAs":"https://zus.com/","logo":"https://logos.yubhub.co/zus.com.png"},"x-apply-url":"https://jobs.lever.co/zushealth/de7b4911-901f-4548-9d68-9b77c0ccf6b6","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$150,000-200,000 per year","x-skills-required":["Healthcare data interoperability","Enterprise software implementations","API platforms","AI-enabled engineering workflows","Leadership skills","FHIR","HL7","CCD","EHR integrations","APIs","Data ingestion pipelines","Modern data 
platforms","Scalable implementation frameworks","Reusable integration patterns","Secure environments","Compliance frameworks","Executive presence","Strategic thinking","Self-starter mindset","Passion for improving healthcare"],"x-skills-preferred":[],"datePosted":"2026-04-17T13:13:25.945Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"Healthcare data interoperability, Enterprise software implementations, API platforms, AI-enabled engineering workflows, Leadership skills, FHIR, HL7, CCD, EHR integrations, APIs, Data ingestion pipelines, Modern data platforms, Scalable implementation frameworks, Reusable integration patterns, Secure environments, Compliance frameworks, Executive presence, Strategic thinking, Self-starter mindset, Passion for improving healthcare","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":150000,"maxValue":200000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_facf5d80-7bd"},"title":"Solutions Engineer, Delivery & Automation","description":"<p>We&#39;re looking for a Solutions Engineer who gets energized by solving gnarly technical problems and making customers wildly successful. 
As the technical quarterback for new customer onboardings, you&#39;ll translate their vision into working integrations, navigate the chaos of healthcare data standards, and ensure they extract real value from day one.</p>\n<p>Key responsibilities:</p>\n<p>Own the technical journey - Lead end-to-end onboarding for new customers, from authentication setup to data mart configuration</p>\n<p>Integrate customer systems with Zus (APIs, SFTP, HL7, FHIR, the whole interoperability stack)</p>\n<p>Translate messy business requirements into clean technical architectures</p>\n<p>Build and maintain automated workflows that make implementations faster and more reliable</p>\n<p>Drive customer success through technical excellence - Be the trusted technical advisor customers call when things get complicated</p>\n<p>Run technical deep dives and implementation reviews that actually move the needle</p>\n<p>Identify integration risks before they become blockers and solve them proactively</p>\n<p>Train customers on best practices so they become power users, not support tickets</p>\n<p>Innovate on process - Use AI tools (LLMs, automation platforms, scripting) to eliminate manual work and scale your impact</p>\n<p>Build templates, scripts, and tooling that make the 10th implementation faster than the 1st</p>\n<p>Document learnings and create repeatable playbooks through automation that make the whole team better</p>\n<p>Collaborate with R&amp;D - Partner closely with Product and Engineering to surface integration challenges and opportunities for platform improvement</p>\n<p>Translate real-world customer integration patterns into product feedback and roadmap insights</p>\n<p>Collaborate with R&amp;D teams on emerging capabilities around AI, data pipelines, and developer tooling</p>\n<p>Act as the voice of the customer when identifying opportunities to improve developer experience and reduce integration friction</p>\n<p>You&#39;ll enjoy solving messy integration challenges, building
automation that eliminates manual work, and partnering closely with Product and Engineering to continuously improve the platform.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_facf5d80-7bd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Zus","sameAs":"https://zus.com/","logo":"https://logos.yubhub.co/zus.com.png"},"x-apply-url":"https://jobs.lever.co/zushealth/fbe45c72-4269-4c7f-b88c-6df3349c2479","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$125,000-165,000 per year","x-skills-required":["healthcare data standards (FHIR, HL7, CCD)","major EMRs (Epic, Cerner, athenahealth)","API and data pipeline experience (ETL, REST APIs, JSON, CSV ingestion)","data platforms (Snowflake, SQL databases) including schema design and query optimization","Python scripting skills and SQL fluency","secure environments and compliance (HIPAA, SOC2)"],"x-skills-preferred":["AI tools (LLMs, automation platforms, scripting)","data pipelines","developer tooling"],"datePosted":"2026-04-17T13:12:29.884Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"healthcare data standards (FHIR, HL7, CCD), major EMRs (Epic, Cerner, athenahealth), API and data pipeline experience (ETL, REST APIs, JSON, CSV ingestion), data platforms (Snowflake, SQL databases) including schema design and query optimization, Python scripting skills and SQL fluency, secure environments and compliance (HIPAA, SOC2), AI tools (LLMs, automation platforms, scripting), data pipelines, developer 
tooling","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":125000,"maxValue":165000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6acd8036-5ec"},"title":"Platform Engineer (Databases & Storage)","description":"<p>We are looking for a Staff Platform Engineer to own the database and storage foundation of World Labs. This is a high-impact systems role at the intersection of databases, distributed systems, and AI infrastructure. You will define how core data systems are designed, scaled, and operated in an environment where workloads are evolving quickly and requirements are often ambiguous.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Owning the design and evolution of the transactional systems that power the platform</li>\n<li>Defining architecture for database and storage systems under high-throughput, low-latency workloads</li>\n<li>Making and driving decisions around data modeling, indexing, replication, and consistency</li>\n<li>Debugging and resolving complex production issues</li>\n<li>Establishing standards for reliability, observability, and operability across the platform</li>\n<li>Partnering with product and research teams to support evolving and often ambiguous requirements</li>\n<li>Driving improvements in performance, scalability, and cost across the system</li>\n<li>Mentoring engineers and raising the bar for system design and technical decision-making</li>\n</ul>\n<p>Key qualifications include:</p>\n<ul>\n<li>10+ years of experience building and operating production systems at scale, with ownership of critical infrastructure</li>\n<li>Strong experience designing and operating transactional systems and databases</li>\n<li>Deep understanding of data modeling, indexing, transactions, concurrency, and consistency tradeoffs</li>\n<li>Experience owning systems with strict reliability and performance requirements in production</li>\n<li>Strong experience debugging complex production issues and reasoning about failure modes</li>\n<li>Experience designing distributed systems or large-scale infrastructure where tradeoffs are non-trivial</li>\n<li>Proven ability to define architecture and drive technical decisions end-to-end</li>\n<li>Strong judgment in balancing performance, reliability, and cost</li>\n<li>Ability to operate effectively in ambiguous, fast-moving environments with high ownership</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Experience with database internals, storage systems, or query engines</li>\n<li>Experience building infrastructure for AI/ML systems or data platforms</li>\n<li>Experience in early-stage or high-growth environments</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6acd8036-5ec","directApply":true,"hiringOrganization":{"@type":"Organization","name":"World Labs","sameAs":"https://www.worldlabs.ai","logo":"https://logos.yubhub.co/worldlabs.ai.png"},"x-apply-url":"https://job-boards.greenhouse.io/worldlabs/jobs/4194381009","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$200-$300k base salary (good-faith estimate for San Francisco Bay Area upon hire; actual offer based on experience, skills, and qualifications)","x-skills-required":["database internals","storage systems","query engines","data modeling","indexing","transactions","concurrency","consistency","distributed systems","large-scale infrastructure"],"x-skills-preferred":["AI/ML systems","data platforms"],"datePosted":"2026-04-17T13:09:33.493Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"database internals, storage systems, query engines, data modeling, indexing, transactions, concurrency, consistency, distributed systems, large-scale infrastructure, AI/ML systems, data
platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":200000,"maxValue":300000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_93a43345-780"},"title":"FinOps Program Manager","description":"<p>We believe that the way people interact with their finances will drastically improve in the next few years. We&#39;re dedicated to empowering this transformation by building the tools and experiences that thousands of developers use to create their own products. Plaid powers the tools millions of people rely on to live a healthier financial life.</p>\n<p>The FinOps function is responsible for financial accountability, visibility, and optimization across all engineering-related spend at Plaid. This includes cloud infrastructure, AI/ML and data workloads, third-party SaaS tools, and other technical investments that support Plaid&#39;s products and internal platforms.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Monitors and analyzes engineering spend across cloud, AI/ML, data platforms, and SaaS, identifying trends, anomalies, and optimization opportunities.</li>\n<li>Builds and maintains forecasts for engineering spend, partnering with Finance and engineering leaders to understand drivers, assumptions, and risks.</li>\n<li>Partners with engineering, product, and TPMs to incorporate cost considerations into roadmaps, architectural decisions, and execution plans.</li>\n<li>Leads cost optimization initiatives, such as rightsizing, commitment strategies, and workload efficiency improvements, in collaboration with engineering owners.</li>\n<li>Creates and maintains dashboards and reporting that make spend understandable and actionable for both engineers and executives.</li>\n<li>Implements FinOps practices and processes, including showback/chargeback models, unit economics, and cost ownership
frameworks.</li>\n<li>Partners on tooling and automation, working with data and engineering teams to improve cost visibility, forecasting accuracy, and operational efficiency.</li>\n<li>Drives alignment and behavior change, helping teams balance cost, performance, reliability, and velocity through data-driven decision making.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>6–10+ years of relevant experience working at the intersection of engineering, infrastructure, data, or finance in a cloud-native or SaaS environment.</li>\n<li>Proven experience partnering closely with engineering teams to influence decisions involving cloud infrastructure, data platforms, AI/ML workloads, or SaaS spend.</li>\n<li>Working understanding of modern cloud-native architectures, including core components such as compute, storage, networking, data pipelines, and managed services, enough to engage credibly with engineers on design, tradeoffs, and cost drivers.</li>\n<li>Strong foundation in cost analysis, forecasting, budgeting, and variance management, with the ability to translate data into clear, actionable insights.</li>\n<li>Comfort working directly with data, including writing SQL (or effectively using AI-assisted tools to do so) to explore datasets, validate assumptions, and answer ad hoc questions.</li>\n<li>Experience building clear, high-quality dashboards and BI artifacts that are not only accurate, but intuitive and delightful for engineers and leaders to use.</li>\n<li>Demonstrated success driving adoption and behavior change, embedding cost awareness into day-to-day engineering workflows, not just producing reports.</li>\n<li>Experience owning and delivering cross-functional programs end-to-end, often without direct authority or a dedicated team.</li>\n<li>Familiarity with FinOps principles and practices (e.g., shared ownership, showback/chargeback, unit economics, optimization strategies).</li>\n<li>Strong communication skills, with the ability to tailor
complex technical and financial concepts for engineering, finance, and executive audiences.</li>\n</ul>\n<p><strong>Nice to Haves</strong></p>\n<ul>\n<li>Hands-on familiarity with cloud cost management tools (e.g., AWS Cost Explorer, GCP Billing, Azure Cost Management, CloudHealth, Cloudability, or similar).</li>\n<li>Experience working with or supporting data platforms and AI/ML workloads, including understanding cost drivers for batch processing, streaming, storage, and model training/inference.</li>\n<li>Exposure to showback/chargeback models, cost allocation strategies, or product-level unit economics.</li>\n<li>Experience improving data models or pipelines that support analytics, reporting, or financial attribution.</li>\n<li>Familiarity with BI tools such as Mode, Tableau, Looker, or similar, and a strong eye for dashboard usability and design.</li>\n<li>Background in a technical role (e.g., engineering, TPM, infra, data, or engineering operations) before moving into a more cross-functional or business-oriented position.</li>\n<li>Experience operating in a high-growth or rapidly scaling environment, where cost structures and investment priorities are evolving quickly.</li>\n</ul>\n<p><strong>Additional Information</strong></p>\n<p>Our mission at Plaid is to unlock financial freedom for everyone. To support that mission, we seek to build a diverse team of driven individuals who care deeply about making the financial ecosystem more equitable. We recognize that strong qualifications can come from both prior work experiences and lived experiences.
We encourage you to apply to a role even if your experience doesn&#39;t fully match the job description.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_93a43345-780","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Plaid","sameAs":"https://plaid.com/","logo":"https://logos.yubhub.co/plaid.com.png"},"x-apply-url":"https://jobs.lever.co/plaid/acb399b1-e0f8-45f3-bffa-c89c9c573a12","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$172,800-$259,200 per year","x-skills-required":["cloud infrastructure","AI/ML","data platforms","SaaS","cost analysis","forecasting","budgeting","variance management","SQL","data visualization","dashboard creation","cross-functional program management","FinOps principles","showback/chargeback models","unit economics","optimization strategies"],"x-skills-preferred":["cloud cost management tools","data platforms and AI/ML workloads","cost allocation strategies","product-level unit economics","BI tools","technical role background","high-growth environment experience"],"datePosted":"2026-04-17T12:52:18.112Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"cloud infrastructure, AI/ML, data platforms, SaaS, cost analysis, forecasting, budgeting, variance management, SQL, data visualization, dashboard creation, cross-functional program management, FinOps principles, showback/chargeback models, unit economics, optimization strategies, cloud cost management tools, data platforms and AI/ML workloads, cost allocation strategies, product-level unit economics, BI tools, technical role background, high-growth environment 
experience","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":172800,"maxValue":259200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_08842cf0-459"},"title":"Account Executive - Enterprise","description":"<p>About Mistral AI</p>\n<p>At Mistral AI, we believe in the power of AI to simplify tasks, save time, and enhance learning and creativity. Our technology is designed to integrate seamlessly into daily working life.</p>\n<p>We are a global company with teams distributed between France, USA, UK, Germany, and Singapore. Our diverse workforce thrives in competitive environments and is committed to driving innovation.</p>\n<p>Role Summary</p>\n<p>As an Enterprise Account Executive, you will play a pivotal role in driving Mistral&#39;s adoption among our most strategic enterprise customers. We are looking for highly action-oriented individuals who thrive in fast-paced environments, love making deals, and have a strong bias for execution.</p>\n<p>Responsibilities</p>\n<p>Lead Development (Strategic Outbound &amp; Qualified Inbound)</p>\n<ul>\n<li>Drive strategic outreach &amp; leverage introductions to engage high-potential enterprise customers</li>\n<li>Convert inbound opportunities into high-value partnerships, including upsells and bespoke enterprise agreements</li>\n<li>Build and maintain a strong pipeline of qualified opportunities</li>\n</ul>\n<p>Value Proposition Validation</p>\n<ul>\n<li>Support enterprise customers through Proof of Concept (POC) phases, ensuring a smooth and impactful evaluation process</li>\n<li>Translate successful evaluations into long-term production contracts by demonstrating clear ROI and business impact</li>\n<li>Align Mistral&#39;s capabilities with customer strategic priorities</li>\n</ul>\n<p>Deal Management &amp; Closing</p>\n<ul>\n<li>Develop and execute strategic sales plans to convert 
leads into long-term customers</li>\n<li>Act as the primary point of contact for external stakeholders throughout the entire sales cycle</li>\n<li>Lead negotiations and collaborate with legal, technical, and implementation teams to finalize agreements</li>\n</ul>\n<p>Executive Engagement</p>\n<ul>\n<li>Build strong relationships with C-level executives, innovation leaders, and senior decision-makers</li>\n<li>Understand customer strategic priorities and position Mistral&#39;s AI capabilities as a critical enabler of their initiatives</li>\n<li>Guide executive stakeholders through complex technology adoption decisions</li>\n</ul>\n<p>Technical Collaboration</p>\n<ul>\n<li>Develop a strong understanding of Mistral&#39;s AI platform and technical capabilities</li>\n<li>Work closely with implementation and engineering teams to address technical questions and ensure successful deployments</li>\n</ul>\n<p>Training &amp; Enablement</p>\n<ul>\n<li>Share customer insights internally to inform product development and strategy</li>\n<li>Help internal teams better understand enterprise customer needs and market opportunities</li>\n</ul>\n<p>Who You Are</p>\n<ul>\n<li><p>Consulting or Strategic Background</p>\n</li>\n<li><p>Experience in strategy consulting (e.g., McKinsey, BCG, Bain, or similar firms)</p>\n</li>\n<li><p>Strong ability to structure complex problems and translate strategy into action</p>\n</li>\n</ul>\n<p>Commercial Mindset</p>\n<ul>\n<li>Highly action-oriented with a strong bias toward execution</li>\n<li>Passion for deal-making and turning opportunities into closed agreements</li>\n<li>Ability to operate in fast-paced and ambiguous environments</li>\n</ul>\n<p>Executive Presence</p>\n<ul>\n<li>Comfortable engaging and influencing C-level executives</li>\n<li>Strong communication and storytelling skills</li>\n</ul>\n<p>Nice to Have</p>\n<ul>\n<li>Experience working on large transformation projects (AI, infrastructure, telecom, energy, or digital 
transformation)</li>\n<li>Familiarity with AI, data platforms, or enterprise software ecosystems</li>\n<li>Ideally held a first role in sales, business development, or deal closing</li>\n</ul>\n<p>What We Offer</p>\n<ul>\n<li>Competitive cash salary and equity</li>\n<li>Food: Daily lunch vouchers</li>\n<li>Sport: Monthly contribution to a Gym pass subscription</li>\n<li>Transportation: Monthly contribution to a mobility pass</li>\n<li>Health: Full health insurance for you and your family</li>\n<li>Parental: Generous parental leave policy</li>\n<li>Visa sponsorship</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_08842cf0-459","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mistral AI","sameAs":"https://mistral.ai","logo":"https://logos.yubhub.co/mistral.ai.png"},"x-apply-url":"https://jobs.lever.co/mistral/2a357282-9d44-4b41-a249-c75ffe878ce2","x-work-arrangement":"hybrid","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["strategy consulting","deal-making","execution","complex problem-solving","communication","storytelling","influencing C-level executives"],"x-skills-preferred":["large transformation projects","AI","data platforms","enterprise software ecosystems"],"datePosted":"2026-04-17T12:45:50.976Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Paris"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"strategy consulting, deal-making, execution, complex problem-solving, communication, storytelling, influencing C-level executives, large transformation projects, AI, data platforms, enterprise software ecosystems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3e6de12f-61a"},"title":"Senior Technical
Program Manager","description":"<p>As Mercury grows, our revenue platforms increasingly sit at the intersection of product launches, data systems, go-to-market motion, and external vendors. We&#39;re hiring a Technical Program Manager for Revenue Technology to ensure that complex, multi-team initiatives actually move: in the right order, with the right dependencies surfaced, and with real progress visible week over week.</p>\n<p>This is not a status-meeting role. This is a role for a technical, hands-on TPM who understands systems well enough to unblock work directly, whether that&#39;s in Linear, Salesforce, data workflows, or coordination across Product, Engineering, Data, and Revenue.</p>\n<p>In this role, you will report to the Head of Platforms &amp; Infrastructure and play a critical role in protecting execution focus so specialized technical roles can operate at full effectiveness.</p>\n<p>Some things you&#39;ll do on the job:</p>\n<ul>\n<li>Own delivery sequencing and execution momentum across Revenue Technology initiatives</li>\n<li>Translate ambiguous goals into clear scopes, milestones, and dependency maps</li>\n<li>Actively manage work in Linear and related tools, ensuring priorities are visible and intentional</li>\n<li>Identify when work is blocked, drifting, over-scoped, or colliding, and intervene early</li>\n<li>Partner closely with Solution Architecture, Data Strategy, Engineering, and Systems &amp; Workflow Experience</li>\n<li>Perform light hands-on technical work (configuration, stopgaps, tooling hygiene) to unblock progress</li>\n<li>Absorb coordination load across Product, Engineering, Data Engineering, Data Science, Revenue, and vendors</li>\n<li>Drive planning cycles anchored to Cosmic Objectives, not ad-hoc urgency</li>\n<li>Make tradeoffs explicit and documented so teams can move without re-litigating decisions</li>\n</ul>\n<p>You should have:</p>\n<ul>\n<li>5+ years of experience as a Technical Program Manager, Technical PM, or equivalent role in a
SaaS or fintech environment</li>\n<li>Demonstrated ability to drive delivery across multi-team, cross-functional initiatives (Product, Engineering, Data, Revenue, vendors)</li>\n<li>Strong technical fluency, able to reason about systems, data flows, integrations, and platform constraints</li>\n<li>Hands-on experience managing work in tools like Linear, Jira, or similar, with a bias toward clarity and momentum</li>\n<li>Proven ability to surface dependencies, sequence work, and intervene when execution is blocked or drifting</li>\n<li>Excellent written and verbal communication skills, especially in ambiguous environments</li>\n<li>Comfort performing light hands-on technical work (tool configuration, workflow setup, stopgap solutions)</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Experience supporting GTM, revenue, or customer lifecycle platforms</li>\n<li>Familiarity with Salesforce, data platforms, or revenue tooling ecosystems</li>\n<li>Experience operating within Agile/Kanban systems, with judgment about when to adapt them</li>\n<li>Background working closely with architecture or platform engineering teams</li>\n</ul>\n<p>The total rewards package at Mercury includes base salary, equity (stock options), and benefits. Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry.
New hire offers are made based on a candidate’s experience, expertise, geographic location, and internal pay equity relative to peers.</p>\n<p>Our target new hire base salary ranges for this role are the following:</p>\n<ul>\n<li>US employees in New York City, Los Angeles, Seattle, or the San Francisco Bay Area: $131,000 - $148,000</li>\n<li>US employees outside of New York City, Los Angeles, Seattle, or the San Francisco Bay Area: $118,000 - $133,200</li>\n<li>Canadian employees (any location): CAD $124,300 - $139,900</li>\n</ul>\n<p>Mercury values diversity &amp; belonging and is proud to be an Equal Employment Opportunity employer. All individuals seeking employment at Mercury are considered without regard to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, sexual orientation, or any other legally protected characteristic.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3e6de12f-61a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mercury","sameAs":"https://www.mercury.com/","logo":"https://logos.yubhub.co/mercury.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mercury/jobs/5856800004","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$118,000 - $148,000","x-skills-required":["Technical Program Management","Revenue Technology","Linear","Salesforce","Data Workflows","Solution Architecture","Data Strategy","Engineering","Systems & Workflow Experience"],"x-skills-preferred":["GTM","Revenue","Customer Lifecycle Platforms","Data Platforms","Revenue Tooling Ecosystems","Agile/Kanban Systems"],"datePosted":"2026-04-17T12:45:37.683Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA, New York, NY, Portland, OR, or Remote
within Canada or United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Fintech","skills":"Technical Program Management, Revenue Technology, Linear, Salesforce, Data Workflows, Solution Architecture, Data Strategy, Engineering, Systems & Workflow Experience, GTM, Revenue, Customer Lifecycle Platforms, Data Platforms, Revenue Tooling Ecosystems, Agile/Kanban Systems","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":118000,"maxValue":148000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_782f7e7f-e0e"},"title":"Revenue Technology - Data Strategy & Operations Lead","description":"<p>We&#39;re looking for a Data Strategy &amp; Operations leader to own the data foundations that power revenue execution. This role ensures that revenue data is reliable, interpretable, scalable, and usable as the business evolves and that teams can act on what they see with confidence.</p>\n<p>In this role, you will report to the Head of Platforms &amp; Infrastructure and play a central role in shaping how Mercury models, governs, and operationalizes GTM data. 
You’ll partner closely with Data Engineering, Data Science, Solution Architecture, and Platform Engineering, among other teams.</p>\n<p>Some key responsibilities include:</p>\n<ul>\n<li>Owning the definition, structure, and reliability of data originating from revenue platforms (e.g., Salesforce, GTM tools, automation systems)</li>\n<li>Serving as the primary decision owner for GTM-sourced tables and views used for revenue execution, forecasting inputs, lifecycle tracking, and signal-based workflows</li>\n<li>Designing and evolving core GTM data models across Salesforce, ETL, and analytics layers</li>\n<li>Partnering with Data Engineering to align GTM schemas with enterprise data models and define clear data contracts between source systems and downstream consumers</li>\n<li>Partnering with Data Science / Analytics to ensure revenue data is interpretable, statistically sound, and reflects how the business actually operates</li>\n<li>Owning clarity around data ownership boundaries, shared dependencies, and escalation paths when upstream or downstream changes impact revenue integrity</li>\n<li>Defining and upholding data quality, freshness, consistency, and documentation standards for revenue platforms</li>\n<li>Monitoring and improving pipeline reliability, performance, and scalability, proactively identifying fragile or redundant transformations</li>\n<li>Identifying opportunities to automate manual or error-prone data workflows and reduce operational overhead</li>\n<li>Acting as a data thought partner to Platforms &amp; Infrastructure, Revenue Operations, Analytics, and Security, advising on feasibility, tradeoffs, and sequencing for data-heavy initiatives</li>\n</ul>\n<p>You should have:</p>\n<ul>\n<li>7+ years of experience in data engineering or data systems roles within SaaS or technology companies</li>\n<li>Deep experience designing and operating production data pipelines</li>\n<li>Highly proficient in SQL and experienced in data modeling</li>\n<li>Hands-on experience with
modern data stacks (e.g., Snowflake, BigQuery, Redshift)</li>\n<li>Experience with ETL / ELT tooling (e.g., dbt, Airflow, Census, or similar)</li>\n<li>Understanding of Salesforce data models and common GTM system architectures</li>\n<li>Ability to translate business concepts into durable, well-structured data models</li>\n<li>Clear communication skills with both technical and non-technical partners</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Experience supporting revenue, sales, or customer lifecycle data</li>\n<li>Familiarity with event-based data platforms (e.g., Data Cloud or equivalents)</li>\n<li>Experience working alongside platform engineering and security teams</li>\n<li>Exposure to data governance, access controls, and compliance considerations</li>\n<li>Experience mentoring or guiding other data practitioners</li>\n</ul>\n<p>The total rewards package at Mercury includes base salary, equity, and benefits. Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_782f7e7f-e0e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mercury","sameAs":"https://www.mercury.com/","logo":"https://logos.yubhub.co/mercury.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mercury/jobs/5806201004","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$142,600 - $198,000","x-skills-required":["SQL","data modeling","modern data stacks","ETL/ELT tooling","Salesforce data models","GTM system architectures"],"x-skills-preferred":["event-based data platforms","data governance","access controls","compliance considerations","mentoring/guiding other data 
practitioners"],"datePosted":"2026-04-17T12:44:37.206Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, data modeling, modern data stacks, ETL/ELT tooling, Salesforce data models, GTM system architectures, event-based data platforms, data governance, access controls, compliance considerations, mentoring/guiding other data practitioners","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":142600,"maxValue":198000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ed25fe37-186"},"title":"Solutions Engineer","description":"<p>As a Solutions Engineer at Hebbia, you will be a revenue-critical leader who owns the technical strategy in complex, high-scrutiny enterprise evaluations. You will operate at the intersection of commercial leadership and AI execution, shaping structured solution strategies while proving real value during the buying process.</p>\n<p>You will bring clarity to ambiguity. You will translate complex enterprise problems into structured, Hebbia-driven solution approaches that executives can confidently act on. At the same time, you will design tailored AI-powered workflows and evaluations that validate impact, reduce decision risk, and strengthen the commercial narrative.</p>\n<p>You will operate as a trusted advisor to sophisticated buyers navigating ambiguity, risk, and scale. 
Your job is to clarify thinking, reduce decision risk, and prove value in environments where accuracy, speed, and explainability matter.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Owning technical strategy from discovery through close in complex enterprise sales cycles</li>\n<li>Leading rigorous technical discovery to understand workflows, data realities, constraints and executive success criteria</li>\n<li>Translating ambiguous customer challenges into clear structured Hebbia solution frameworks</li>\n<li>Designing and configuring tailored AI-powered workflows to prove differentiated value during evaluations</li>\n<li>Delivering high-stakes live demos and technical discussions with senior stakeholders</li>\n<li>Surfacing risks early, challenging assumptions, and shaping mitigation strategies that accelerate deal progression</li>\n<li>Partnering closely with Sales to increase win rates, deal velocity, and executive confidence</li>\n<li>Providing structured product feedback that sharpens positioning and informs roadmap decisions</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>5+ years in a customer-facing technical role within complex enterprise sales environments (Solutions Engineering, Sales Engineering, Consulting, Implementation, or similar)</li>\n<li>Deep experience with AI systems, data platforms, LLM-based workflows or decision intelligence platforms</li>\n<li>Demonstrable executive presence, able to earn trust from C-Suite practitioners</li>\n<li>Strong at structuring ambiguous problems into clear, defensible solution approaches</li>\n<li>Strong preference for experience selling to or performing Analyst roles in Banking, Asset Management, Private Markets, Legal or Enterprise Knowledge teams in complex enterprise environments</li>\n<li>Comfortable operating in high-expectation, performance-driven environments</li>\n</ul>\n<p>Compensation includes a 70:30 split between team and individual performance, with a combined base + bonus compensation 
range of $150,000 - $200,000 + competitive equity.</p>","url":"https://yubhub.co/jobs/job_ed25fe37-186","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Hebbia","sameAs":"https://hebbia.com","logo":"https://logos.yubhub.co/hebbia.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/hebbia/jobs/4680820005","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$150,000 - $200,000","x-skills-required":["AI systems","data platforms","LLM-based workflows","decision intelligence platforms","Solutions Engineering","Sales Engineering","Consulting","Implementation"],"x-skills-preferred":["Banking","Asset Management","Private Markets","Legal","Enterprise Knowledge"],"datePosted":"2026-04-17T12:38:22.911Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, UK"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"AI systems, data platforms, LLM-based workflows, decision intelligence platforms, Solutions Engineering, Sales Engineering, Consulting, Implementation, Banking, Asset Management, Private Markets, Legal, Enterprise Knowledge","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":150000,"maxValue":200000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_77bbde4e-977"},"title":"Solutions Engineer","description":"<p>As a Solutions Engineer at Hebbia, you will be a revenue-critical leader who owns the technical strategy in complex, high-scrutiny enterprise evaluations. 
You will operate at the intersection of commercial leadership and AI execution, shaping structured solution strategies while proving real value during the buying process.</p>\n<p>You will bring clarity to ambiguity. You will translate complex enterprise problems into structured, Hebbia-driven solution approaches that executives can confidently act on. At the same time, you will design tailored AI-powered workflows and evaluations that validate impact, reduce decision risk, and strengthen the commercial narrative.</p>\n<p>You will operate as a trusted advisor to sophisticated buyers navigating ambiguity, risk, and scale. Your job is to clarify thinking, reduce decision risk, and prove value in environments where accuracy, speed, and explainability matter.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Owning technical strategy from discovery through close in complex enterprise sales cycles</li>\n<li>Leading rigorous technical discovery to understand workflows, data realities, constraints, and executive success criteria</li>\n<li>Translating ambiguous customer challenges into clear, structured Hebbia solution frameworks</li>\n<li>Designing and configuring tailored AI-powered workflows to prove differentiated value during evaluations</li>\n<li>Delivering high-stakes live demos and technical discussions with senior stakeholders</li>\n<li>Surfacing risks early, challenging assumptions, and shaping mitigation strategies that accelerate deal progression</li>\n<li>Partnering closely with Sales to increase win rates, deal velocity, and executive confidence</li>\n<li>Providing structured product feedback that sharpens positioning and informs roadmap decisions</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>5+ years in a customer-facing technical role within complex enterprise sales environments</li>\n<li>Deep experience with AI systems, data platforms, LLM-based workflows, or decision intelligence platforms</li>\n<li>Demonstrable executive presence, able to earn trust from C-Suite practitioners</li>\n<li>Strength in structuring ambiguous problems into clear, defensible solution approaches</li>\n<li>Strong preference for experience selling to or performing Analyst roles in Banking, Asset Management, Private Markets, Legal, or Enterprise Knowledge teams in complex enterprise environments</li>\n<li>Comfort operating in high-expectation, performance-driven environments</li>\n</ul>\n<p>Benefits include unlimited PTO, medical, dental, vision, 401(k), wellness benefits, catered lunch daily, DoorDash dinner credit, parental leave policy, fertility benefits, and a competitive equity package with unmatched upside potential.</p>","url":"https://yubhub.co/jobs/job_77bbde4e-977","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Hebbia","sameAs":"https://hebbia.com","logo":"https://logos.yubhub.co/hebbia.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/hebbia/jobs/4556367005","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$150,000 - $200,000","x-skills-required":["AI systems","data platforms","LLM-based workflows","decision intelligence platforms","executive presence","structured problem-solving"],"x-skills-preferred":["Selling to Banking, Asset Management, Private Markets, Legal or Enterprise Knowledge teams","Experience with complex enterprise sales environments"],"datePosted":"2026-04-17T12:38:13.494Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"AI systems, data platforms, LLM-based workflows, decision intelligence platforms, executive presence, structured problem-solving, Selling to Banking, Asset Management, Private Markets, Legal or Enterprise Knowledge teams, Experience with complex enterprise sales 
environments","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":150000,"maxValue":200000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_aebaacf5-640"},"title":"Integrations Engineer","description":"<p>You will own the full lifecycle of integrations that power Hebbia&#39;s AI: from designing connectors to deploying them in production, monitoring their behavior, and debugging failures in real time.</p>\n<p>You&#39;ll work across systems like Snowflake, S3, SharePoint, and internal customer infrastructure, building pipelines that need to handle real-world complexity: unreliable APIs, evolving schemas, massive datasets, and edge cases that don’t show up in documentation.</p>\n<p>This role is hands-on, high-ownership, and deeply technical. You won’t just write code; you’ll develop the instincts to operate and debug complex distributed systems in production.</p>\n<p>You will build connectors and ingestion pipelines that bring enterprise data into Hebbia&#39;s AI platform, from Snowflake warehouses and SharePoint libraries to live pricing feeds, high-velocity news data, and proprietary customer systems.</p>\n<p>You will design and operate pipelines that handle scale, failures, and edge cases gracefully.</p>\n<p>You will debug issues across APIs, auth systems, and data formats, often under real-time customer pressure.</p>\n<p>You will own reliability end-to-end: monitoring, alerting, on-call, and incident response.</p>\n<p>You will improve internal tooling and observability to make systems more robust and easier to operate.</p>\n<p>You will partner with product and customer teams to scope, prioritize, and ship the integrations that unlock Hebbia&#39;s highest-value use cases.</p>\n<p>You will design and ship agents that sit on top of the ingestion layer, making enterprise data accessible and actionable across all of 
Hebbia&#39;s product surfaces, from document analysis to structured query workflows.</p>","url":"https://yubhub.co/jobs/job_aebaacf5-640","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Hebbia","sameAs":"https://hebbia.com/","logo":"https://logos.yubhub.co/hebbia.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/hebbia/jobs/4675784005","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$160,000 to $265,000","x-skills-required":["Python","APIs","OAuth flows","webhook patterns","rate limiting","pagination","cloud infrastructure","AWS","Kafka","PostgreSQL","Redis","ElasticSearch"],"x-skills-preferred":["enterprise data platforms","document processing pipelines","content extraction systems","agentic systems","LLM-enabled products","AI tools"],"datePosted":"2026-04-17T12:36:59.611Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City; San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, APIs, OAuth flows, webhook patterns, rate limiting, pagination, cloud infrastructure, AWS, Kafka, PostgreSQL, Redis, ElasticSearch, enterprise data platforms, document processing pipelines, content extraction systems, agentic systems, LLM-enabled products, AI tools","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":160000,"maxValue":265000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e5a00310-3c3"},"title":"Technical Product Manager III","description":"<p>We are seeking a Technical Product Manager to join the Freenome Product Management team. 
This role will be responsible for planning and coordinating the execution of Freenome&#39;s engineering initiatives for the Commercial Data Platform and Data Products.</p>\n<p>You will create, plan, and execute software development plans while ensuring efficiency, scalability, and usability for our Commercial, Laboratory, and R&amp;D teams. You will gather requirements and prioritize technical development in alignment with cross-functional teams and stakeholders.</p>\n<p>This role will lead discussions with key decision-makers and stakeholders to create a comprehensive data platform roadmap outlining Freenome&#39;s path to commercialization and support for ongoing product development. You will work closely with our engineering team, leveraging your deep knowledge in Agile methodologies (Scrum, Kanban, sprint planning, etc.) to enable seamless execution.</p>\n<p>The role reports to the Associate Director, Technical Product Management. This is a remote role, with onsite travel to our headquarters in Brisbane, California as needed.</p>\n<p>As a Technical Product Manager, you will:</p>\n<ul>\n<li>Define requirements and manage the development processes, systems, and software infrastructure for the data platform and tools supporting Commercial and R&amp;D initiatives.</li>\n<li>Partner with lab and business stakeholders to elucidate requirements, optimize workflows, and lead data platform and product definition, ensuring alignment with regulatory standards (e.g., HIPAA, HITECH).</li>\n<li>Work with engineers to translate product requirements into Epics and Stories with clear user acceptance criteria.</li>\n<li>Track and drive development and ensure fulfillment of requirements using an issue tracking system (e.g., Jira).</li>\n<li>Lead and facilitate productive interactions between users and engineers to ensure user needs are understood and met.</li>\n</ul>\n<p>You will create a thorough technical strategy and roadmap to scale Freenome&#39;s commercial data platform, 
tooling, and data products to support current and future products. You will provide a clear and tangible vision for future development efforts with scalability and accessibility in mind.</p>\n<p>You will ensure complete data integrity throughout the pipeline with technical and process-oriented solutions.</p>\n<p>Must have:</p>\n<ul>\n<li>Bachelor&#39;s degree in an analytical, technical, or scientific field; advanced degree preferred.</li>\n<li>At least 5 years of experience in a software development environment, ideally within biotechnology and NGS.</li>\n<li>Deep expertise in identification and prioritization of software features.</li>\n<li>Demonstrated ability to successfully lead complex trade-off decisions between functions and measurably improve organizational success and processes.</li>\n<li>Experience leading software development efforts to meet regulated market standards (FDA, CAP/CLIA), laboratory operations, medical affairs, or clinical study operations for diagnostics.</li>\n</ul>\n<p>Nice to have:</p>\n<ul>\n<li>Experience in building solutions with a focus on multiomics, NGS, or proteomics a plus.</li>\n</ul>\n<p>Benefits and additional information:</p>\n<p>The US target range of our base salary for new hires is $118,150 - $167,475. You will also be eligible to receive equity, cash bonuses, and a full range of medical, financial, and other benefits depending on the position offered. Please note that individual total compensation for this position will be determined at the Company&#39;s sole discretion and may vary based on several factors, including but not limited to, location, skill level, years and depth of relevant experience, and education.</p>\n<p>We invite you to check out our career page at <a href=\"https://freenome.com/job-openings/\">https://freenome.com/job-openings/</a> for additional company information.</p>\n<p>Freenome is proud to be an equal-opportunity employer, and we value diversity. 
Freenome does not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, veteran status, or any other status protected under federal, state, or local law.</p>\n<p>Applicants have rights under Federal Employment Laws.</p>","url":"https://yubhub.co/jobs/job_e5a00310-3c3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Freenome","sameAs":"https://freenome.com/","logo":"https://logos.yubhub.co/freenome.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/freenome/jobs/8504755002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$118,150 - $167,475","x-skills-required":["Agile methodologies","Software development","Data platform","Product management","Regulatory standards","HIPAA","HITECH","Jira","User acceptance criteria","Data integrity"],"x-skills-preferred":["Multiomics","NGS","Proteomics"],"datePosted":"2026-04-17T12:35:53.668Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Brisbane, California"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Biotechnology","skills":"Agile methodologies, Software development, Data platform, Product management, Regulatory standards, HIPAA, HITECH, Jira, User acceptance criteria, Data integrity, Multiomics, NGS, Proteomics","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":118150,"maxValue":167475,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5242ca9a-088"},"title":"Staff Automation 
Engineer","description":"<p>We are looking for a Staff Automation Engineer to have a huge impact on the Business Systems, Security, Production Engineering, and IT functions. This role is for a seasoned engineer who thrives on solving complex operational challenges, enhancing system security and stability, and improving efficiency through automation and best practices using AI technologies.</p>\n<p>Your day-to-day will involve implementing Agentic AI and LLM-powered workflows using tools such as Tines, AWS Agentcore, AWS Bedrock, and Claude Code. You will deploy systems with Infrastructure as Code (IaC) (e.g., Terraform) and build and maintain automation workflows across key enterprise platforms (e.g., Atlassian, Okta, Google Workspace, Slack, Zoom, knowledge management systems), cybersecurity systems (e.g., SIEM, GRC platforms, Data Security Platforms), and cloud environments (AWS, GCP).</p>\n<p>You will build AI-driven chatbots or intelligent agents that automate tasks, support conversational workflows, and integrate with enterprise applications. You will partner with IT, Security, GRC, Procurement, and business teams to automate operational tasks and processes to reduce toil, improve efficiency, and enable the business.</p>\n<p>You will develop integrations using REST APIs, JSON, webhooks, and scripting languages (JavaScript, Python). You will follow established automation and AI standards for quality, security, and governance; provide improvements where appropriate.</p>\n<p>You will troubleshoot, maintain, and optimize existing workflows to improve stability and performance. 
You will document designs, workflows, configurations, and operational procedures.</p>\n<p>You will participate in code reviews, technical discussions, and team-based learning to uplift engineering quality and consistency.</p>\n<p>You will work with various tooling in Security, IT, and Production Engineering.</p>\n<p>This role requires 10+ years of experience in automation engineering, systems integration, or workflow development. You should have experience with automation platforms such as Tines, Retool, Superblocks, n8n, etc. You should also have hands-on experience with Terraform and containerization technologies.</p>\n<p>You should have experience developing LLM-powered automations, conversational interfaces, or Agentic AI assistants. You should have knowledge of Git and modern version control practices.</p>\n<p>You should have strong skills in REST APIs, JSON, webhooks, JavaScript, and Python. You should also have familiarity with identity systems (Okta, SCIM) and RBAC concepts.</p>\n<p>You should have familiarity with cloud environments such as Google Cloud Platform (GCP) and Amazon Web Services (AWS).</p>\n<p>You should be able to break down problems, collaborate cross-functionally, and deliver solutions with moderate guidance.</p>\n<p>You should have strong communication skills and the ability to translate functional requirements into technical outputs.</p>\n<p>Preferred experience includes familiarity with data platform and database technologies (e.g., Snowflake, PostgreSQL, Cassandra, DynamoDB).</p>\n<p>Work perks at Greenlight include medical, dental, vision, and HSA match, paid life insurance, AD&amp;D, and disability benefits, traditional 401k with company match, unlimited PTO, paid company holidays and pop-up bonus holidays, professional development stipends, mental health resources, 1:1 financial planners, fertility healthcare, 100% paid parental and caregiving leave, plus cleaning service and meals during your leave, flexible WFH, both remote and 
in-office opportunities, fully stocked kitchen, catered lunches, and occasional in-office happy hours, and employee resource groups.</p>","url":"https://yubhub.co/jobs/job_5242ca9a-088","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Greenlight","sameAs":"https://www.greenlight.com/","logo":"https://logos.yubhub.co/greenlight.com.png"},"x-apply-url":"https://jobs.lever.co/greenlight/d85a9c34-4434-4f6d-8f01-bccb9521c036","x-work-arrangement":"hybrid","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$180,000-$225,000","x-skills-required":["Agentic AI","LLM-powered workflows","Tines","AWS Agentcore","AWS Bedrock","Claude Code","Infrastructure as Code (IaC)","Terraform","REST APIs","JSON","webhooks","JavaScript","Python","Git","modern version control practices","identity systems","RBAC concepts","cloud environments","Google Cloud Platform (GCP)","Amazon Web Services (AWS)"],"x-skills-preferred":["data platform and database technologies","Snowflake","PostgreSQL","Cassandra","DynamoDB"],"datePosted":"2026-04-17T12:35:33.366Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Agentic AI, LLM-powered workflows, Tines, AWS Agentcore, AWS Bedrock, Claude Code, Infrastructure as Code (IaC), Terraform, REST APIs, JSON, webhooks, JavaScript, Python, Git, modern version control practices, identity systems, RBAC concepts, cloud environments, Google Cloud Platform (GCP), Amazon Web Services (AWS), data platform and database technologies, Snowflake, PostgreSQL, Cassandra, 
DynamoDB","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":225000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_bbaf1090-aa3"},"title":"Senior Product Manager, Cyngn Insight & Fleet Management","description":"<p>About Cyngn</p>\n<p>Cyngn is a publicly-traded autonomous technology company that deploys self-driving industrial vehicles to factories, warehouses, and other facilities throughout North America. We are looking for a Senior Product Manager to lead the strategy and execution of Cyngn Insight and Fleet Management, the company&#39;s core software platform for operating, monitoring, and scaling autonomous vehicle fleets in production environments.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Own the product vision, strategy, and roadmap for Cyngn Insight and Fleet Management, including fleet operations, monitoring, analytics, safety workflows, and enterprise integrations.</li>\n<li>Define and maintain a multi-quarter roadmap that balances customer needs, operational scalability, system reliability, security, and commercial objectives.</li>\n<li>Act as the senior product owner for fleet and operations initiatives, driving clear prioritization, trade-off decisions, and execution across concurrent programs.</li>\n<li>Partner with Engineering and Program Management to translate platform strategy into actionable requirements, milestones, and delivery plans.</li>\n<li>Translate customer workflows, operational data, and deployment feedback into product requirements that improve fleet uptime, efficiency, safety visibility, and ease of use.</li>\n<li>Collaborate with Sales, Business Development, Marketing, and Customer Success on go-to-market strategy, customer onboarding, and long-term account success.</li>\n<li>Define, track, and communicate success metrics (KPIs) for fleet performance and platform 
adoption, including utilization, uptime, incident response, and customer value realization.</li>\n</ul>\n<p>Qualifications</p>\n<ul>\n<li>BS/BA in a technical field or equivalent practical experience.</li>\n<li>5+ years of product management experience, including at least 2 years in a senior or lead PM role.</li>\n<li>Demonstrated success delivering complex, multi-tenant software platforms or enterprise products operating at scale.</li>\n<li>Strong technical aptitude with experience working closely with engineering teams on distributed systems, data platforms, or cloud-based products.</li>\n<li>Excellent communication and leadership skills, with comfort engaging executives, customers, and deeply technical stakeholders.</li>\n</ul>\n<p>Preferred Qualifications</p>\n<ul>\n<li>Experience with fleet management systems, enterprise SaaS platforms, or operational analytics products.</li>\n<li>Background in software development, data platforms, or systems engineering (e.g., Python, SQL, cloud infrastructure).</li>\n<li>Familiarity with autonomous vehicles, robotics operations, industrial automation, or large-scale deployments in logistics, manufacturing, or warehousing.</li>\n</ul>\n<p>Benefits &amp; Perks</p>\n<ul>\n<li>Health benefits (Medical, Dental, Vision, HSA and FSA (Health &amp; Dependent Daycare), Employee Assistance Program, 1:1 Health Concierge)</li>\n<li>Life, Short-term and long-term disability insurance (Cyngn funds 100% of premiums)</li>\n<li>Company 401(k)</li>\n<li>Commuter Benefits</li>\n<li>Flexible vacation policy</li>\n<li>Sabbatical leave opportunity after 5 years with the company</li>\n<li>Paid Parental Leave</li>\n<li>Daily lunches for in-office employees and fully-stocked kitchen with snacks and beverages</li>\n<li>Monthly meal and tech allowances for remote employees</li>\n<li>Allowance to purchase new headphones when you join!</li>\n</ul>","url":"https://yubhub.co/jobs/job_bbaf1090-aa3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Cyngn","sameAs":"https://www.cyngn.com/","logo":"https://logos.yubhub.co/cyngn.com.png"},"x-apply-url":"https://jobs.lever.co/cyngn/29a77488-5a84-4242-aba3-a5cca4a6681c","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$160,000-180,000 per year","x-skills-required":["product management","software development","data platforms","cloud-based products","fleet management systems","enterprise SaaS platforms","operational analytics products"],"x-skills-preferred":["Python","SQL","cloud infrastructure","autonomous vehicles","robotics operations","industrial automation","large-scale deployments"],"datePosted":"2026-04-17T12:28:00.826Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Mountain View"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"product management, software development, data platforms, cloud-based products, fleet management systems, enterprise SaaS platforms, operational analytics products, Python, SQL, cloud infrastructure, autonomous vehicles, robotics operations, industrial automation, large-scale deployments","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":160000,"maxValue":180000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3d849fbc-058"},"title":"Member of Product, Data Platform","description":"<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto.</p>\n<p>The Data Platform team is the backbone of Anchorage Digital&#39;s information infrastructure. 
As data becomes the lifeblood of every product, compliance workflow, and client-facing report we produce, this team is responsible for building and operating a unified, scalable, and reliable data platform that serves the entire organization.</p>\n<p>As a Data Platform Product Manager, you will own the strategy and execution for centralizing and formalizing the company&#39;s data infrastructure, spanning internal operational data, transaction and blockchain data, customer data, and external data sources.</p>\n<p>Your mission is to transform a fragmented data landscape into a single source of truth that powers mission-critical reporting, business insights, and downstream product experiences across every team at Anchorage.</p>\n<p>This is a force-multiplier role. Your work will elevate the quality, speed, and reliability of every product and team at the company.</p>\n<p>You will define the standards, build the platform, and create the foundation that enables Anchorage to scale with confidence.</p>\n<p>If you thrive at the intersection of complex data systems, cross-functional influence, and platform thinking, this is your opportunity to have outsized impact at a category-defining company in digital assets.</p>\n<p>Below, we define our Factors of Growth &amp; Impact to help Anchorage Villagers measure their impact and articulate feedback, coaching, and the rich learning that happens while exploring, developing, and mastering capabilities within and beyond the Member of Product, Data Platform role:</p>\n<p><strong>Technical Skills:</strong></p>\n<ul>\n<li>Own the detailed prioritization of the data platform roadmap, balancing foundational infrastructure work, new capabilities, and technical debt.</li>\n<li>Demonstrate deep strategic thinking in shaping the platform roadmap, considering the unique data challenges of digital assets, blockchain protocols, and regulated financial services.</li>\n<li>Deliver complex, cross-functional projects with multiple dependencies 
across engineering, analytics, compliance, and operations teams.</li>\n<li>Work closely with engineering and data science counterparts to drive product development processes, sprint planning, and architectural decisions.</li>\n<li>Understand and reason about system architecture, including data warehousing, ETL/ELT pipelines, streaming vs. batch processing, and modern data stack components, and communicate clear requirements to engineering.</li>\n<li>Drive comprehensive go-to-market strategy for internal platform adoption, including defining success metrics, tracking KPIs around data quality and platform usage, and iterating based on data-driven insights.</li>\n</ul>\n<p><strong>Complexity and Impact of Work:</strong></p>\n<ul>\n<li>Lead and influence cross-functional teams while maintaining strong stakeholder relationships across the entire organization, from engineering to finance to compliance.</li>\n<li>Exercise independent decision-making and take full ownership of data platform strategy and execution.</li>\n<li>Contribute strategic insights that significantly impact company direction, operational efficiency, and product quality.</li>\n<li>Demonstrate platform leadership that elevates the performance and effectiveness of every team that depends on data.</li>\n</ul>\n<p><strong>Organizational Knowledge:</strong></p>\n<ul>\n<li>Develop deep understanding of Anchorage&#39;s business model, product suite, regulatory environment, and organizational structure.</li>\n<li>Build and maintain strong relationships with stakeholders across all departments to ensure the data platform serves the company&#39;s most critical needs.</li>\n<li>Navigate and improve organizational data practices to enhance efficiency, compliance, and decision-making.</li>\n<li>Drive company objectives through strategic data platform decisions and initiatives.</li>\n</ul>\n<p><strong>Communication and Influence:</strong></p>\n<ul>\n<li>Effectively influence and motivate teams across 
the organization to adopt platform standards and invest in data quality, even when those teams do not report to you.</li>\n<li>Enable cross-functional collaboration through clear, consistent communication about platform capabilities, timelines, and data governance expectations.</li>\n<li>Act as a thoughtful knowledge partner to senior leadership, translating complex data infrastructure topics into clear business impact.</li>\n<li>Proactively communicate platform goals, status updates, and data health metrics throughout the organization.</li>\n</ul>\n<p><strong>You may be a fit for this role if you:</strong></p>\n<ul>\n<li>5+ years of product management experience, with significant time spent on data platforms, data infrastructure, or data-intensive enterprise products.</li>\n<li>Proven experience building or scaling enterprise data platforms, including data warehousing, data lakes, ETL/ELT pipelines, or modern data stack tooling (e.g., Snowflake, Databricks, dbt, Airflow, Spark).</li>\n<li>Strong understanding of data modeling, data governance, and data quality frameworks.</li>\n<li>Experience working with diverse data types, including transactional data, customer data, financial data, and ideally blockchain or on-chain data.</li>\n<li>Track record of driving cross-functional alignment and adoption for internal platform products where you must influence without direct authority.</li>\n<li>Exceptional written and verbal communication skills, with the ability to convey complex data architecture concepts to both technical and non-technical audiences.</li>\n<li>Your empathy and adaptability not only complement others&#39; working styles but also embody our culture of curiosity, creativity, and shared understanding.</li>\n<li>You self-describe as some combination of the following: creative, humble, ambitious, detail-oriented, hard-working, trustworthy, eager to learn, methodical, action-oriented, and tenacious.</li>\n</ul>\n<p><strong>Although not a requirement, bonus 
points if you have:</strong></p>\n<ul>\n<li>Hands-on experience with blockchain data indexing, on-chain analytics, or crypto-native data infrastructure.</li>\n<li>Experience building data platforms that serve both internal analytics consumers and external client-facing products (reports, statements, dashboards).</li>\n<li>Experience supporting clients with data-related issues or concerns.</li>\n</ul>","url":"https://yubhub.co/jobs/job_3d849fbc-058","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anchorage Digital","sameAs":"https://anchorage.com","logo":"https://logos.yubhub.co/anchorage.com.png"},"x-apply-url":"https://jobs.lever.co/anchorage/0e730f61-a2e4-4152-8277-3f6383cc69a6","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data platforms","data infrastructure","data-intensive enterprise products","data warehousing","data lakes","ETL/ELT pipelines","modern data stack tooling","Snowflake","Databricks","dbt","Airflow","Spark","data modeling","data governance","data quality frameworks","blockchain or on-chain data"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:18:21.529Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data platforms, data infrastructure, data-intensive enterprise products, data warehousing, data lakes, ETL/ELT pipelines, modern data stack tooling, Snowflake, Databricks, dbt, Airflow, Spark, data modeling, data governance, data quality frameworks, blockchain or on-chain 
data"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_30da0df8-cc9"},"title":"F&S COE Analyst","description":"<p>We are seeking an analytically-driven Data Analyst to join our Finance &amp; Strategy team at Stripe. This role bridges the gap between data science and financial planning, requiring someone who can transform complex business data into actionable financial insights.</p>\n<p>You will build sophisticated dashboards, develop predictive models, and serve as the technical backbone for our FP&amp;A and GTM analytics initiatives. This is a unique opportunity for a data professional with financial acumen to directly influence strategic business decisions in a high-growth fintech environment.</p>\n<p><strong>Financial Data Analytics &amp; Modeling</strong></p>\n<ul>\n<li>Design, build, and maintain financial dashboards for FP&amp;A, Revenue Operations, and GTM teams using Tableau, Power BI, or Looker</li>\n<li>Develop automated financial reporting solutions that reduce manual effort and improve data accuracy</li>\n<li>Create sophisticated data models to support budgeting, forecasting, variance analysis, and scenario planning</li>\n<li>Build predictive models for revenue forecasting, customer lifetime value, churn analysis, and unit economics</li>\n</ul>\n<p><strong>Business Intelligence &amp; Reporting</strong></p>\n<ul>\n<li>Partner with Finance Business Partners and FP&amp;A teams to translate business requirements into technical solutions</li>\n<li>Design and implement data infrastructure for financial planning cycles (monthly/quarterly reviews, annual budgets, long-range planning)</li>\n<li>Develop self-service analytics capabilities enabling finance teams to access real-time business insights</li>\n<li>Create executive dashboards tracking key financial and operational metrics (ARR, bookings, retention, CAC, LTV)</li>\n</ul>\n<p><strong>Data Engineering &amp; Analytics 
Infrastructure</strong></p>\n<ul>\n<li>Write complex SQL queries to extract, transform, and analyze large datasets from multiple source systems</li>\n<li>Build ETL pipelines to integrate financial data from ERP, CRM, billing, and data warehouse systems</li>\n<li>Ensure data quality, consistency, and governance across financial reporting systems</li>\n<li>Optimize database performance and data architecture for scalability</li>\n</ul>\n<p><strong>Strategic Analysis &amp; Insights</strong></p>\n<ul>\n<li>Conduct deep-dive analyses on business performance, identifying trends, anomalies, and opportunities</li>\n<li>Support strategic initiatives through ad-hoc financial modeling and what-if scenario analysis</li>\n<li>Translate complex data findings into clear, actionable recommendations for leadership</li>\n<li>Collaborate with Data Science teams to develop advanced analytics and ML models for finance use cases</li>\n</ul>\n<p><strong>Required Qualifications</strong></p>\n<ul>\n<li>Advanced SQL proficiency (complex joins, window functions, CTEs, query optimization)</li>\n<li>Expert-level experience with at least one BI tool (Tableau, Power BI, Looker, or Qlik)</li>\n<li>Advanced Excel/Google Sheets skills (pivot tables, complex formulas, data modeling)</li>\n<li>Python or R for data analysis, automation, and statistical modeling</li>\n<li>Cloud data platforms (Snowflake, BigQuery, Redshift, Databricks)</li>\n<li>ETL tools (dbt, Airflow, Fivetran) and version control (Git)</li>\n</ul>\n<p><strong>Financial &amp; Business Acumen</strong></p>\n<ul>\n<li>Experience in data analytics within finance, FP&amp;A, or revenue operations functions</li>\n<li>Strong understanding of financial statements (P&amp;L, balance sheet, cash flow)</li>\n<li>Knowledge of key financial metrics: ARR, MRR, bookings, revenue recognition, CAC, LTV, gross margin, EBITDA</li>\n<li>Experience with financial planning processes: budgeting, forecasting, variance analysis, scenario 
modeling</li>\n<li>Understanding of SaaS/subscription business models and revenue recognition principles (ASC 606 preferred)</li>\n</ul>\n<p><strong>Analytical &amp; Problem-Solving</strong></p>\n<ul>\n<li>Proven ability to work with large, complex datasets and derive meaningful insights</li>\n<li>Experience building financial models and dashboards that drive executive decision-making</li>\n<li>Strong statistical analysis skills and understanding of data visualization best practices</li>\n<li>Track record of translating ambiguous business problems into structured analytical frameworks</li>\n</ul>\n<p><strong>Preferred Experience</strong></p>\n<ul>\n<li>Background in fintech, payments, B2B SaaS, or high-growth technology companies</li>\n<li>Experience supporting GTM analytics (sales forecasting, pipeline analysis, quota setting)</li>\n<li>Familiarity with finance systems: NetSuite, Anaplan, Adaptive Planning, Salesforce, Stripe Billing</li>\n<li>Exposure to data science methodologies and machine learning concepts</li>\n<li>Previous work in cross-functional environments collaborating with finance, data science, and business teams</li>\n</ul>\n<p><strong>Key Competencies</strong></p>\n<ul>\n<li>Business Acumen: Ability to understand complex business models and translate them into data requirements</li>\n<li>Technical Excellence: Deep technical skills with commitment to code quality and best practices</li>\n<li>Communication: Exceptional ability to explain technical concepts to non-technical stakeholders</li>\n<li>Stakeholder Management: Experience partnering with senior leaders and influencing through data</li>\n<li>Ownership Mindset: Self-directed with ability to manage multiple priorities and drive projects to completion</li>\n<li>Continuous Learning: Curiosity to learn new tools, techniques, and business domains</li>\n<li>Attention to Detail: Commitment to data accuracy and quality in high-stakes financial 
reporting</li>\n</ul>\n<p><strong>Education</strong></p>\n<ul>\n<li>Bachelor&#39;s degree in Finance, Economics, Statistics, Mathematics, Computer Science, Engineering, or related quantitative field</li>\n<li>Advanced degree (MBA, MS in Analytics/Data Science) or relevant certifications (CFA, CPA, data analytics certifications) a plus</li>\n</ul>","url":"https://yubhub.co/jobs/job_30da0df8-cc9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Stripe","sameAs":"https://stripe.com/","logo":"https://logos.yubhub.co/stripe.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/stripe/jobs/7597624","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Tableau","Power BI","Looker","Python","R","Cloud data platforms","ETL tools","Version control"],"x-skills-preferred":["Machine learning","Data science","Finance systems","Data visualization"],"datePosted":"2026-03-31T18:15:28.979Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"SQL, Tableau, Power BI, Looker, Python, R, Cloud data platforms, ETL tools, Version control, Machine learning, Data science, Finance systems, Data visualization"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_76107476-db4"},"title":"Senior Data Scientist, Gemini App, Google DeepMind","description":"<p>We are seeking a Senior Data Scientist to join our Gemini App team in Zurich, Switzerland. 
As a key partner and co-creator in our product strategy, you will be instrumental in building a uniquely proactive and powerful assistant by ensuring our strategic decisions are grounded in data.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Partner with Verticals PM, engineering, and UX to develop data-driven product strategies</li>\n<li>Translate ambiguous questions into well-defined problems, design experiments, and analyse large complex datasets for insights</li>\n<li>Develop and implement novel, goal-oriented metrics</li>\n<li>Build and deploy statistical/ML models to understand our users, enhance product capabilities and personalise user experience</li>\n<li>Communicate findings &amp; recommendations to stakeholders, including executives</li>\n<li>Champion data-driven culture by feeding user engagement insights back into models</li>\n<li>Collaborate with the GenAI team on model quality and feature adoption</li>\n<li>Act as a technical leader for a global team, guiding junior members on complex analyses and upholding best practices to ensure high-quality, impactful work</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s degree in Statistics, Mathematics, Data Science, Engineering, Physics, Economics, or a related quantitative field</li>\n<li>5 years of experience with analysis applications (e.g., extracting insights, performing statistical analysis, or solving business problems), and coding (e.g., Python, R, SQL) or 2 years of experience with a Master&#39;s degree</li>\n<li>2 years of work experience identifying opportunities for business/product improvement and then defining/measuring the success of those initiatives</li>\n<li>A bias for action and a relentless drive to build something great</li>\n<li>Strong business acumen and a strategic mindset</li>\n<li>Deep technical expertise</li>\n<li>Exceptional communication and presentation skills</li>\n<li>A collaborative and influential partner</li>\n<li>A commitment to user trust and 
privacy</li>\n</ul>\n<p>Preferred qualifications include a Master&#39;s degree in a relevant field and experience working with machine learning and statistical modelling tools.</p>","url":"https://yubhub.co/jobs/job_76107476-db4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Google DeepMind","sameAs":"https://deepmind.com/","logo":"https://logos.yubhub.co/deepmind.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/deepmind/jobs/7560458","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","R","SQL","Machine Learning","Statistical Modelling","Data Analysis","Data Science","Business Acumen","Strategic Thinking"],"x-skills-preferred":["Experience with deep learning frameworks","Knowledge of cloud-based data platforms","Familiarity with agile development methodologies"],"datePosted":"2026-03-16T14:42:46.350Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Zurich, Switzerland"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, R, SQL, Machine Learning, Statistical Modelling, Data Analysis, Data Science, Business Acumen, Strategic Thinking, Experience with deep learning frameworks, Knowledge of cloud-based data platforms, Familiarity with agile development methodologies"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e6811a48-f96"},"title":"VP, Partner Engineering","description":"<p><strong>VP, Partner Engineering at Quantexa</strong></p>\n<p><strong>What we&#39;re all about.</strong></p>\n<p>Our DATA values—Determination, Ambition, Teamwork, Accountability—aren&#39;t just operating principles, they&#39;re our competitive advantage. 
We bring context to every decision with a category-defining Decision Intelligence platform, and our partner ecosystem is the multiplier.</p>\n<p>As VP, Partner Engineering, you will lead the technical vision, execution, and team required to scale partner-built and partner-delivered solutions across the world&#39;s most strategic technology and consulting landscapes.</p>\n<p><strong>Make Every Decision Count with Context</strong></p>\n<p><strong>The opportunity.</strong></p>\n<p>Reporting to the SVP Global Alliances, the VP, Partner Engineering is the executive accountable for setting the Partner Engineering strategy to transform our partner ecosystem into a best-in-class technical delivery engine that expands consumption, accelerates CoSell and Channel revenue, and drives production deployments with repeatability and governance discipline.</p>\n<p>This role will include:</p>\n<ul>\n<li>Developing, executing and leading the Partner Engineering strategy to deliver non-linear revenue growth via the Ecosystem</li>\n<li>Technical leadership of partner architecture</li>\n<li>Management of Partner Solution Architects as a player-coach, leading by example</li>\n<li>Execution of joint solution engineering with hyperscalers, industry ISVs &amp; GSIs</li>\n<li>Executive engagement and business impact through ecosystem consumption &amp; marketplace ARR</li>\n</ul>\n<p><strong>What you&#39;ll be doing.</strong></p>\n<ul>\n<li>Set the global Partner Engineering strategy and architecture standards aligned to cloud marketplaces, consumption incentives, and AI/data collaborations.</li>\n<li>Lead co-innovation blueprints and quick starts integrating our platform with the likes of Microsoft Azure, GCP, AWS, Databricks, Guidewire-style decision systems, and GSI solution frameworks.</li>\n<li>Personally sponsor and govern the reference architecture, connector strategy, AI/data interop strategy, and workload optimization patterns for partner solutions.</li>\n<li>Create enterprise-grade integration frameworks and enforce architectural guardrails that accelerate deployment while minimizing partner/customer delivery risk.</li>\n<li>Partner with Product and R&amp;D for cross-functional collaboration and alignment around roadmap and innovation</li>\n<li>Lead, mentor, and scale a global team of Partner Solution Architects focused on:</li>\n</ul>\n<ul>\n<li>Joint solution architecture</li>\n<li>Partner-led PoCs/PoVs</li>\n<li>Demo and sandbox platforms</li>\n<li>Deployment acceleration</li>\n<li>Technical certifications and readiness</li>\n</ul>\n<ul>\n<li>Build an elite Partner Architecture Centre of Excellence that rivals top ecosystem engineering orgs in the industry.</li>\n<li>Define clear role expectations, capability uplift plans, and partner field alignment models for PSAs.</li>\n<li>Establish a performance-driven culture anchored in partner adoption velocity, PoV win rate, demo adoption, deployment acceleration, and marketplace ARR.</li>\n<li>Partner with Field Alliances, Field Engineering and RVPs to enable and ensure Ecosystem attribution</li>\n<li>Own hyperscaler and ISV technical partnership co-innovation 
roadmaps, ensuring:</li>\n</ul>\n<ul>\n<li>Marketplace-ready joint solutions</li>\n<li>Consumption and integration patterns tied to incentives</li>\n<li>Field delivery kits for PSA + partner sellers</li>\n<li>Hardened reference designs for repeatable pursuits</li>\n</ul>\n<ul>\n<li>Drive GSI partnerships into factory-style delivery motions, launching industry solutions and field-ready plays with global systems integrators.</li>\n<li>Govern multi-party architecture risk, security, deployment patterns, rapid time-to-value, and executive sponsor checkpoints.</li>\n<li>Be the Evangelist for Ecosystem opportunities and own the hypothesis qualification for embedded or OEM, co-developed offerings and new pursuits</li>\n<li>Working closely with Product, define our ecosystem’s technical differentiation narrative—anchored in decision context, AI/data intelligence, workflows, and enterprise integration depth—against point solutions and platform incumbents.</li>\n<li>Ensure PSAs and partners can articulate why our platform wins architecturally, commercially, and operationally.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<p><strong>What you&#39;ll bring.</strong></p>\n<ul>\n<li>Proven executive leadership building partner engineering teams within enterprise software ecosystems</li>\n<li>Deep technical fluency in data platforms, AI/ML, analytics, cloud integration patterns, connectors, workflow systems and marketplace consumption architectures.</li>\n<li>Experience launching production-hardened joint solutions with hyperscalers and GSIs with measurable revenue and deployment outcomes.</li>\n<li>Ability to translate platform capability into partner-owned architectures that scale across customer estates.</li>\n<li>Practitioner’s mindset for governance of PoVs, partner certifications, deployment velocity, risk mitigation, and technical adoption KPIs</li>\n<li>Extensive experience in senior partner engineering, sales engineering or ecosystem technical leadership within 
large-scale enterprise software or SaaS companies.</li>\n<li>Proven experience managing architect/engineering-focused partner teams, PSAs, and global technical enablement functions.</li>\n<li>Demonstrated success driving multi-cloud partner consumption ARR, joint solution launches, and co-sell acceleration.</li>\n<li>Executive communication credibility with partner CTO/CIO/VP engineering stakeholders and hyperscaler/GSI technical leaders.</li>\n<li>Comfort operating across legal, sales, product, security, presales, and partner delivery functions.</li>\n<li>Willingness to travel globally (&gt;40%).</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p><strong>Our perks and quirks.</strong></p>\n<p>What makes you Q will help you to realize your full potential, flourish and enjoy what you do, while being recognized and rewarded with our broad range of benefits.</p>\n<p>We know that just having an excellent Glassdoor rating isn’t enough, so we’ve put together a competitive package as a way of saying thank you for all your hard work and dedication.</p>\n<p>We offer:</p>\n<p>Competitive salary; Company bonus; Flexible working hours in a hybrid workplace &amp; free access to global WeWork locations &amp; events; Pension Scheme with a company contribution of 6% (if you contribute 3%); 25 days annual leave (with the option to buy up to 5 days) + birthday off! 
Work from Anywhere Scheme: Spend up to 2 months working outside of your country of employment over a rolling 12-month period; Family: Enhanced Maternity, Paternity, Adoption, or Shared Parental Leave; Health &amp; Wellbeing: Private Healthcare, EAP, Well-being Days, Calm App, Gym Discounts; Team&#39;s Social Budget &amp; Company-wide Summer &amp; Winter Parties; Tech &amp; Cycle-to-Work Schemes; Volunteer Day off; Dog-friendly Offices</p>","url":"https://yubhub.co/jobs/job_e6811a48-f96","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Quantexa","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/899Vt1BX63R4TGffvrdv1a/hybrid-vp%2C-partner-engineering-in-london-at-quantexa","x-work-arrangement":"hybrid","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data platforms","AI/ML","analytics","cloud integration patterns","connectors","workflow systems","marketplace consumption architectures","partner engineering","partner architecture","joint solution engineering","hyperscalers","industry ISV’s","GSIs","executive engagement","ecosystem consumption","marketplace ARR","global team management","partner solution architects","joint solution architecture","partner-led PoCs/PoVs","demo and sandbox platforms","deployment acceleration","technical certifications","performance-driven culture","partner adoption velocity","PoV win rate","demo adoption","deployment acceleration","marketplace ARR","GSI partnerships","factory-style delivery motions","industry solutions","field-ready plays","global systems integrators","multi-party architecture risk","security","deployment patterns","rapid time-to-value","executive sponsor checkpoints","ecosystem opportunities","embedded or OEM","co-developed offerings","new 
pursuits","technical differentiation narrative","decision context","AI/data intelligence","workflows","enterprise integration depth","point solutions","platform incumbents","practitioner’s mindset","governance of PoVs","partner certifications","deployment velocity","risk mitigation","technical adoption KPIs","senior partner engineering","sales engineering","ecosystem technical leadership","large-scale enterprise software","SaaS companies","architect/engineering-focused partner teams","PSAs","global technical enablement functions","multi-cloud partner consumption ARR","joint solution launches","co-sell acceleration","executive communication credibility","partner CTO/CIO/VP engineering stakeholders","hyperscaler/GSI technical leaders","comfort operating across legal","sales","product","security","presales","partner delivery functions","willingness to travel globally"],"x-skills-preferred":[],"datePosted":"2026-03-09T17:05:42.899Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data platforms, AI/ML, analytics, cloud integration patterns, connectors, workflow systems, marketplace consumption architectures, partner engineering, partner architecture, joint solution engineering, hyperscalers, industry ISV’s, GSIs, executive engagement, ecosystem consumption, marketplace ARR, global team management, partner solution architects, joint solution architecture, partner-led PoCs/PoVs, demo and sandbox platforms, deployment acceleration, technical certifications, performance-driven culture, partner adoption velocity, PoV win rate, demo adoption, deployment acceleration, marketplace ARR, GSI partnerships, factory-style delivery motions, industry solutions, field-ready plays, global systems integrators, multi-party architecture risk, security, deployment patterns, rapid time-to-value, executive sponsor checkpoints, ecosystem 
opportunities, embedded or OEM, co-developed offerings, new pursuits, technical differentiation narrative, decision context, AI/data intelligence, workflows, enterprise integration depth, point solutions, platform incumbents, practitioner’s mindset, governance of PoVs, partner certifications, deployment velocity, risk mitigation, technical adoption KPIs, senior partner engineering, sales engineering, ecosystem technical leadership, large-scale enterprise software, SaaS companies, architect/engineering-focused partner teams, PSAs, global technical enablement functions, multi-cloud partner consumption ARR, joint solution launches, co-sell acceleration, executive communication credibility, partner CTO/CIO/VP engineering stakeholders, hyperscaler/GSI technical leaders, comfort operating across legal, sales, product, security, presales, partner delivery functions, willingness to travel globally"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7af16166-8fd"},"title":"FBS Senior Data Domain Architect","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. 
That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p><strong>What to expect on your journey with us:</strong></p>\n<ul>\n<li>A solid and innovative company with a strong market presence</li>\n<li>A dynamic, diverse, and multicultural work environment</li>\n<li>Leaders with deep market knowledge and strategic vision</li>\n<li>Continuous learning and development</li>\n</ul>\n<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>\n<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>\n<li>Works independently, receiving minimal guidance and direction, to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>\n<li>Reviews high-level design to ensure alignment to Solution Architecture.</li>\n<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>\n<li>Mentors developers and creates reference implementations/frameworks.</li>\n<li>Partners with System Architects to elaborate capabilities and features.</li>\n<li>Delivers single domain architecture solutions and executes continuous domain level architecture improvement roadmap. 
Actively supports design and steering of a continuous delivery pipeline.</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>Over 6 years of experience as a senior domain architect for Data domains</li>\n<li>Advanced English Level</li>\n<li>Master&#39;s degree (PLUS)</li>\n<li>Insurance Experience (PLUS); Financial Services (PLUS)</li>\n</ul>\n<p><strong>Technical &amp; Business Skills:</strong></p>\n<ul>\n<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>\n<li>Data Architecture / Data Modeling - Advanced (MUST)</li>\n<li>Data Warehouse - Advanced (MUST)</li>\n<li>Cloud Data Platforms - Advanced</li>\n<li>Data Integration Tools - Advanced</li>\n<li>Snowflake or Databricks - Intermediate (4-6 Years) MUST</li>\n<li>Any Cloud - Intermediate (4-6 Years)</li>\n<li>Power BI or Tableau - Intermediate (4-6 Years)</li>\n<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>\n<li>Data Lakehouse - Intermediate (MUST)</li>\n</ul>\n<ul>\n<li>Data Governance - Intermediate</li>\n<li>AI/ML - Entry Level (PLUS)</li>\n<li>Master Data Management - Intermediate</li>\n<li>Operational Data Management - Intermediate</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7af16166-8fd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/jdUFHSPZZjHsgd3TR4R3BS/remote-fbs-senior-data-domain-architect-in-colombia-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["ETL/ELT Tools (Informatica, DBT)","Data Architecture / Data Modeling","Data Warehouse","Cloud Data Platforms","Data Integration Tools","Snowflake or Databricks","Any Cloud","Power BI or Tableau","Data Science tools (Sagemaker, Databricks)","Data Lakehouse"],"x-skills-preferred":["Data Governance","AI/ML","Master Data Management","Operational Data Management"],"datePosted":"2026-03-09T17:00:36.230Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, AI/ML, Master Data Management, Operational Data Management"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_592c119a-bbe"},"title":"FSI Sales Director - Japan","description":"<p><strong>FSI Sales Director - Japan</strong></p>\n<p><strong>What We&#39;re Looking For</strong></p>\n<p>We&#39;re looking for a highly motivated sales professional to drive growth in financial institutions across Japan. 
As an FSI Sales Director, you will be responsible for executing the sales go-to-market strategy, identifying new opportunities, building our pipeline, winning new deals, and meeting company targets.</p>\n<p><strong>What You&#39;ll Be Doing</strong></p>\n<ul>\n<li>Define and execute the Japan Financial Services go-to-market strategy in partnership with the Head of Sales Asia &amp; Japan, SME teams, and APAC Alliances Director.</li>\n<li>Acquire new logos across Japan&#39;s mega banks and insurance institutions in a high-impact hunting role.</li>\n<li>Consistently achieve revenue targets while expanding footprint within strategic accounts.</li>\n<li>Build and progress a high-quality pipeline through disciplined prospecting and structured account planning.</li>\n<li>Engage C-suite, business, data, and technology leaders, positioning Quantexa as a strategic enterprise partner.</li>\n<li>Orchestrate complex, multi-stakeholder enterprise sales cycles, driving opportunities to close.</li>\n<li>Collaborate cross-functionally with senior leadership and Solution Engineering to shape compelling technical and commercial outcomes.</li>\n<li>Execute aligned account strategies that convert executive engagement into tangible commercial results.</li>\n<li>This is a Tokyo-based role focused on Japan&#39;s Financial Services sector.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>A deep understanding of the Japan financial services landscape</li>\n<li>Experience selling complex, multi-stakeholder, multi-month enterprise solutions</li>\n<li>Expertise in solution selling within Japan&#39;s mega banks and insurers</li>\n<li>Ambitious and energetic, with strong interpersonal skills</li>\n<li>A good team player, capable of delivering results in less-than-perfect circumstances</li>\n<li>Emotionally intelligent and a good reader of people</li>\n<li>Fluent Japanese and strong business-level English required as a minimum</li>\n<li>Experience working across multi-national cross
functional teams, such as pre-sales, product, marketing, and customer success</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive base salary</li>\n<li>Company bonus</li>\n<li>Free Calm App subscription</li>\n<li>Annual leave, national holidays, and your birthday off!</li>\n<li>Ongoing personal development</li>\n<li>Country-specific benefits</li>\n</ul>","url":"https://yubhub.co/jobs/job_592c119a-bbe","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Quantexa","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/nYUmfvxNtu6vKseaT3HG16/hybrid-fsi-sales-director---japan-in-tokyo-at-quantexa","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Sales","Enterprise Sales","Complex Sales","Financial Services","Japan Financial Services","Mega Banks","Insurance Institutions"],"x-skills-preferred":["Advanced Data Analytics","Big Data","AI/ML","Financial crime/KYC and Fraud solutions","Data Management","Data Platforms"],"datePosted":"2026-03-09T17:00:04.070Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Tokyo"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Finance","skills":"Sales, Enterprise Sales, Complex Sales, Financial Services, Japan Financial Services, Mega Banks, Insurance Institutions, Advanced Data Analytics, Big Data, AI/ML, Financial crime/KYC and Fraud solutions, Data Management, Data Platforms"}]}