<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>52a09fd7-0a6</externalid>
      <Title>Trade Floor Support Engineer</Title>
      <Description><![CDATA[<p>We are seeking a highly skilled Trade Floor Support Engineer to join our team. The successful candidate will be responsible for delivering high-quality technical support to end-users in a fast-paced trading environment.</p>
<p>Key responsibilities include serving as the primary interface with trading business units, ensuring seamless technology operations, and proactively addressing issues. The ideal candidate will thrive in a collaborative environment, demonstrate a strong sense of ownership, and be passionate about providing world-class support to drive business success.</p>
<p>The role requires 5+ years of progressive technical support experience in an enterprise-level environment, preferably within the financial industry. The ideal candidate will have strong knowledge of Active Directory and Exchange, familiarity with ITIL frameworks and tools such as ServiceNow, and proven experience with market data platforms and vendor integrations.</p>
<p>In addition to the above requirements, the successful candidate will be able to prioritize and perform under pressure in a fast-moving, constantly changing environment. They will also be able to maintain and expand technical knowledge through continuous learning, with a focus on delivering exceptional customer support.</p>
<p>This is an exciting opportunity to work at the heart of a fast-paced trading environment, where technology and business intersect. You will have the chance to collaborate with top-tier professionals, tackle complex challenges, and make a direct impact on the success of our trading operations.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$175,000 to $250,000</Salaryrange>
      <Skills>Active Directory, Exchange, ServiceNow, Market data platforms, Vendor integrations, Windows environments, Microsoft Windows OS, Microsoft Office Suite, Mobile devices, VDI and Citrix environments, Basic network and telecommunications connectivity</Skills>
      <Category>IT</Category>
      <Industry>Finance</Industry>
      <Employername>IT Infrastructure</Employername>
      <Employerlogo>https://logos.yubhub.co/mlp.eightfold.ai.png</Employerlogo>
      <Employerdescription>IT Infrastructure provides IT services to various organisations. It operates globally.</Employerdescription>
      <Employerwebsite>https://mlp.eightfold.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://mlp.eightfold.ai/careers/job/755955158840</Applyto>
      <Location>New York, New York, United States of America</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1d84bf66-726</externalid>
      <Title>Senior Global Medical Affairs Leader</Title>
      <Description><![CDATA[<p>Location: Gaithersburg</p>
<p>Hybrid: 3 days a week on site</p>
<p>Reporting to the Medical Head of Emerging Medicine, the Senior Global Medical Affairs Leader is an experienced medical affairs leader with a strong record of vision and impact. This role designs and drives bold global medical strategies within the CVRM Emerging Medicine therapy area, steering cross-functional and cross-regional teams to deliver ambitious outcomes. Acting as a key enterprise leader, the Senior GMAL shapes global medical agendas through leadership of Global Medical Teams, Global Product Teams and Disease Area Teams, while partnering closely with Medical and Commercial colleagues across markets and regions. This position champions innovative, disruptive thinking, identifies breakthrough opportunities, reimagines market dynamics, and accelerates high-impact evidence generation and medical change initiatives. It calls for someone ready to navigate and transform health systems, harness digital and advanced analytics, drive stakeholder engagement, and ensure AstraZeneca’s science leads both externally and internally through world-class evidence and strategic partnerships. Are you ready to set a new standard for what medical affairs can achieve?</p>
<p>Accountabilities</p>
<p>The Senior GMAL leads the design and execution of global product and disease area strategies, translating scientific insight and healthcare system understanding into clear, practice-changing medical plans. This includes shaping ambitious practice change initiatives that influence care pathways, market access, and treatment adoption across diverse geographies. The role is responsible for the global evidence generation strategy, from interventional trials (Phases 2–4) to real-world evidence programs, ensuring that clinical research, registries and RWE studies are aligned with brand, access and healthcare transformation objectives. It drives optimisation and communication of data across scientific platforms, ensuring differentiated evidence reaches regulators, payers and healthcare professionals in a timely and impactful way.</p>
<p>A core responsibility is to blend advanced scientific expertise with digital skill, using digital health tools, analytics and data platforms to transform how evidence is communicated and how stakeholders are engaged. The Senior GMAL crafts compelling scientific narratives, leads digital dissemination of evidence, and safeguards accuracy, impact and compliance in all scientific communications. The role builds and sustains influential relationships across the healthcare ecosystem, including global and regional KOLs, professional societies, regulators, payers, patient organisations, digital innovators and health systems. It mobilises support for medical strategy and healthcare change initiatives, while mentoring cross-functional teams and encouraging scientific leadership and collaboration.</p>
<p>The Senior GMAL also brings deep regulatory insight to the design of clinical trials and evidence programs, anticipating evolving regulatory and payer requirements to ensure robust, compliant data generation and communication. This includes guiding teams on regulatory expectations, shaping evidence packages for submissions and HTA processes, and ensuring that medical plans are fully integrated with commercial, access and lifecycle strategies. Throughout, the role acts with urgency, challenges the status quo, spots new opportunities in data and science, and leads teams through complex decision-making to deliver significant impact for patients worldwide.</p>
<p>Essential Skills/Experience</p>
<ul>
<li>Master’s degree in any field with at least 5 years of experience in the pharmaceutical industry (Medical, Marketing, R&amp;D, Market Access), or extensive experience within a health system (clinical, pharmacy, pathway design, etc.)</li>
<li>Demonstrated leadership in complex, global healthcare environments, including leading franchise/TA business in key markets, driving brand performance, and leading large, diverse teams.</li>
<li>Significant expertise and strategic insight into healthcare system pathways, regulatory landscapes, market-shaping, and external stakeholder engagement.</li>
<li>Consistent track record of internal and external network building, collaboration across geographies and matrix teams, and effective mentoring of junior colleagues or peers.</li>
<li>Digital proficiency, maximising digital health tools and data platforms to advance evidence generation and communication.</li>
<li>Deep understanding of patient care pathways, evolving treatment protocols, scientific literature, and contemporary approaches to healthcare transformation.</li>
<li>Demonstrated ability to guide, influence, and align diverse stakeholders, including thought leaders, regulatory bodies, patient groups, and payer organisations.</li>
</ul>
<p>Desirable Skills/Experience</p>
<ul>
<li>Doctoral-level degree (MD, PhD, PharmD) in a relevant field.</li>
<li>Recognised expertise in the relevant or related therapeutic area, often evidenced by a history of clinical practice, research leadership, or significant external engagement.</li>
<li>Advanced experience designing, leading, and interpreting clinical research programs, including pivotal clinical trials (Ph2–Ph4), practice-changing RWE studies, and implementation of innovative evidence generation approaches.</li>
<li>Broad experience beyond Medical Affairs; for example, prior roles in Marketing, R&amp;D, or Market Access.</li>
<li>Long-standing relationships with global KOLs, societies, regulators, digital health partners, and innovative healthcare organisations.</li>
<li>Track record of launching new products or indications in multiple, diverse markets (e.g., US, China, EU), and successfully developing and delivering global medical strategies.</li>
</ul>
<p>AstraZeneca offers an environment built on innovation and curiosity where difference is valued and questioning minds are encouraged to challenge convention; teams reflect the diversity of the communities they serve, work closely together across disciplines and geographies, embrace digital and data to solve complex challenges at pace, learn continuously through shared insight and feedback, and channel their entrepreneurial spirit into developing the next generation of therapeutics that improve outcomes for patients and society. If this role matches your experience and ambitions, apply now to join us on this journey! Are you ready to bring insights and fresh thinking to the table? Brilliant! We have one seat available, and we hope it’s yours. Apply today.</p>
<p>AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We follow all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.</p>
<p>The annual base pay for this position ranges from $243,586.40 to $336,379.60 USD. Hourly and salaried non-exempt employees will also be paid overtime pay when working qualifying overtime hours. Base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. In addition, our positions offer a short-term incentive bonus opportunity; eligibility to participate in our equity-based long-term incentive program (salaried roles), to receive a retirement contribution (hourly roles), and commission payment eligibility (sales roles). Benefits offered include a qualified retirement program [401(k) plan]; paid vacation and holidays; paid leaves; and health benefits.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$243,586.40 - $336,379.60 USD</Salaryrange>
      <Skills>Master’s degree in any field, 5 years of experience in the pharmaceutical industry, Leadership in complex, global healthcare environments, Significant expertise and strategic insight into healthcare system pathways, Digital proficiency, maximising digital health tools and data platforms, Doctoral-level degree (MD, PhD, PharmD), Recognised expertise in the relevant or related therapeutic area, Advanced experience designing, leading, and interpreting clinical research programs, Broad experience beyond Medical Affairs, Long-standing relationships with global KOLs, societies, regulators, digital health partners, and innovative healthcare organisations</Skills>
      <Category>Medical</Category>
      <Industry>Healthcare</Industry>
      <Employername>Global BPM CVRM Emerging Medicines</Employername>
      <Employerlogo>https://logos.yubhub.co/astrazeneca.eightfold.ai.png</Employerlogo>
      <Employerdescription>AstraZeneca is a multinational biopharmaceutical company that develops and commercializes prescription medicines and vaccines for diseases such as cancer, cardiovascular, gastrointestinal, infection, neuroscience, oncology, and respiratory.</Employerdescription>
      <Employerwebsite>https://astrazeneca.eightfold.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://astrazeneca.eightfold.ai/careers/job/563877689800017</Applyto>
      <Location>Gaithersburg, Maryland, United States of America</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>a3618001-1b4</externalid>
      <Title>GTM Architect</Title>
      <Description><![CDATA[<p>The GTM Architect, Enterprise will report to the Head of Enterprise GTM and will own the operating systems, measurement, and engagement model that power Scale AI&#39;s enterprise revenue motion.</p>
<p>This role is responsible for designing and running the enterprise GTM architecture across RevOps, sales performance, and cross-functional execution, ensuring Scale&#39;s largest and most strategic accounts are engaged with rigor, consistency, and impact.</p>
<p>The GTM Architect will bring a strong point of view on how Scale engages enterprise accounts, how performance is measured, and how teams operate day-to-day to drive predictable growth. Over time, this role will have the opportunity to build and lead a GTM / RevOps team.</p>
<p>Responsibilities:</p>
<ul>
<li>Own and evolve Scale&#39;s enterprise GTM operating model, including account engagement strategy, funnel design, and sales execution standards</li>
<li>Design and manage the enterprise RevOps systems stack (e.g., CRM, forecasting, reporting, planning tools) to support scalable growth</li>
<li>Define, track, and operationalize core enterprise GTM metrics across pipeline health, deal velocity, forecast accuracy, and rep productivity</li>
<li>Establish and run sales operating cadences including pipeline reviews, forecast calls, QBRs, and performance reviews</li>
<li>Partner with Sales Leadership and Finance to design, implement, and maintain enterprise sales compensation plans, including quota governance and attainment reporting</li>
<li>Build executive-level dashboards and reporting to support GTM decision-making and leadership visibility</li>
<li>Serve as a strategic thought partner to Enterprise Sales leaders, bringing a strong opinion on how accounts should be covered, prioritized, and engaged</li>
<li>Act as the connective tissue across Sales, Marketing, Finance, Product, and Solutions Engineering for enterprise GTM planning and execution</li>
<li>Support annual and quarterly planning efforts including territory design, capacity modeling, headcount planning, and quota setting</li>
<li>Ensure data integrity, process clarity, and operational discipline across all enterprise GTM motions</li>
</ul>
<p>Ideally, You Will Have:</p>
<ul>
<li>8–12+ years of experience in RevOps, Sales Ops, or GTM Strategy roles, with deep exposure to enterprise sales environments</li>
<li>Experience supporting complex, multi-stakeholder enterprise sales motions in high-growth B2B SaaS or platform companies</li>
<li>Proven ownership of sales systems, forecasting, and performance measurement at scale</li>
<li>Hands-on experience designing and managing enterprise sales compensation plans and reporting</li>
<li>Strong understanding of enterprise GTM metrics, planning cycles, and operating cadences</li>
<li>A clear point of view on how enterprise accounts should be engaged and how to operationalize that engagement at scale</li>
<li>Comfort influencing senior sales leaders and executives without direct authority</li>
<li>Excellent written and verbal communication skills, including experience building executive-level materials and dashboards</li>
<li>Strong command of GTM systems and tools (e.g., Salesforce, Clari, planning and reporting tools)</li>
<li>High attention to detail paired with the ability to operate at a strategic altitude</li>
<li>Demonstrated ability to operate as a senior IC with the ambition and capability to build and lead a team over time</li>
<li>Technical curiosity or experience working alongside technical products and teams; familiarity with AI, ML, or data platforms is a plus</li>
</ul>
<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training. Scale employees in eligible roles are also granted equity based compensation, subject to Board of Director approval. Your recruiter can share more about the specific salary range for your preferred location during the hiring process, and confirm whether the hired role will be eligible for equity grant. You’ll also receive benefits including, but not limited to: Comprehensive health, dental and vision coverage, retirement benefits, a learning and development stipend, and generous PTO. Additionally, this role may be eligible for additional benefits such as a commuter stipend.</p>
<p>Please reference the job posting&#39;s subtitle for where this position will be located. For pay transparency purposes, the base salary range for this full-time position in the locations of San Francisco, New York, Seattle is: $176,000-$220,000 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$176,000-$220,000 USD</Salaryrange>
      <Skills>RevOps, Sales Ops, GTM Strategy, CRM, Forecasting, Reporting, Planning Tools, Sales Compensation Plans, Quota Governance, Attainment Reporting, Executive-Level Dashboards, Leadership Visibility, Strategic Thought Partner, Enterprise Sales Leaders, Account Engagement Strategy, Funnel Design, Sales Execution Standards, Data Integrity, Process Clarity, Operational Discipline, AI, ML, Data Platforms, Technical Products, GTM Systems, Tools, Salesforce, Clari, Planning and Reporting Tools</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Scale AI</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale AI develops reliable AI systems for the world&apos;s most important decisions, providing high-quality data and full-stack technologies to power leading models.</Employerdescription>
      <Employerwebsite>https://scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4662232005</Applyto>
      <Location>San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6365e7d7-511</externalid>
      <Title>Senior Forward Deployed Data Scientist/Engineer</Title>
      <Description><![CDATA[<p>We&#39;re hiring a Senior Forward Deployed Data Scientist / Engineer to work directly with customers on ambiguous, high-impact problems at the intersection of data science, product development, and AI deployment.</p>
<p>This is not a traditional analytics role. On this team, data scientists do the core statistical and modeling work, but they also build real tools and products: evaluation explorers, operator workflows, decision-support systems, experimentation surfaces, and customer-specific AI/data applications that get used in production.</p>
<p>The right candidate is strong in first-principles problem solving, rigorous measurement, and technical execution. They know how to define metrics, design experiments, diagnose failures, and build systems that people actually use. They are also comfortable using modern AI-assisted development tools to prototype and iterate quickly without sacrificing reliability, observability, or judgment. Python and SQL matter in this role, but primarily as execution fluency in service of building better products and making better decisions.</p>
<p>Responsibilities:</p>
<ul>
<li>Partner directly with enterprise customers to understand workflows, operational pain points, constraints, and success criteria</li>
<li>Turn ambiguous business and product problems into measurable solutions with clear metrics, technical designs, and deployment plans</li>
<li>Design and build internal and customer-facing data products, including evaluation tools, workflow applications, decision-support systems, and thin product layers on top of data/ML systems</li>
<li>Build end-to-end solutions across data ingestion, transformation, experimentation, statistical modeling, deployment, monitoring, and iteration</li>
<li>Design evaluation frameworks, benchmarks, and feedback loops for ML/LLM systems, human-in-the-loop workflows, and model-assisted operations</li>
<li>Apply rigorous statistical thinking to experimentation, causal inference, metric design, forecasting, segmentation, diagnostics, and performance measurement</li>
<li>Use AI-assisted development workflows to accelerate prototyping and product iteration, while maintaining strong engineering discipline</li>
<li>Diagnose failure modes across data quality, model behavior, retrieval, workflow design, and user experience, and drive fixes into production</li>
<li>Act as the voice of the customer to Product, Engineering, and Data Science, using field learnings to shape roadmap and platform capabilities</li>
</ul>
<p>Requirements:</p>
<ul>
<li>5+ years of experience in data science, machine learning, quantitative engineering, or another highly analytical technical role</li>
<li>Proven track record of shipping data, ML, or AI systems that delivered measurable business or product impact</li>
<li>Exceptional ability to structure ambiguous problems, define the right success metrics, and translate them into executable technical plans</li>
<li>Strong foundation in statistics, experimentation, causal reasoning, and measurement</li>
<li>Experience building tools or products, not just analyses; for example internal workflow tools, evaluation systems, operator-facing products, experimentation platforms, or customer-specific applications</li>
<li>Hands-on fluency in Python, SQL, and modern data/AI tooling; able to inspect data, prototype quickly, debug deeply, and productionize solutions that work</li>
<li>Comfort using AI-assisted coding and development workflows to move from idea to usable product quickly</li>
<li>Strong communication and stakeholder management skills; able to work effectively with customers, engineers, product teams, and executives</li>
<li>High ownership and bias toward shipping in fast-moving environments with incomplete information</li>
</ul>
<p>Preferred qualifications:</p>
<ul>
<li>Experience in a forward deployed, solutions, consulting, or other client-facing technical role</li>
<li>Experience designing evaluation frameworks for LLMs, retrieval systems, agentic workflows, or other AI-enabled products</li>
<li>Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow</li>
<li>Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery</li>
<li>Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems</li>
<li>Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling</li>
<li>Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow redesign</li>
</ul>
<p>What success looks like: Success in this role means taking a messy, high-stakes customer problem and turning it into a deployed system that is actually used. Sometimes that system is a model. Sometimes it is an evaluation framework. Sometimes it is an operator-facing tool or a lightweight data product that changes how decisions get made. In all cases, success is defined by measurable impact, rigorous evaluation, and reliable execution.</p>
<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training. Scale employees in eligible roles are also granted equity based compensation, subject to Board of Director approval. Your recruiter can share more about the specific salary range for your preferred location during the hiring process, and confirm whether the hired role will be eligible for equity grant. You’ll also receive benefits including, but not limited to: Comprehensive health, dental and vision coverage, retirement benefits, a learning and development stipend, and generous PTO. Additionally, this role may be eligible for additional benefits such as a commuter stipend.</p>
<p>Salary Range: $167,200-$209,000 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$167,200-$209,000 USD</Salaryrange>
      <Skills>Python, SQL, Modern data/AI tooling, Statistics, Experimentation, Causal reasoning, Measurement, Data science, Machine learning, Quantitative engineering, Experience in a forward deployed, solutions, consulting, or other client-facing technical role, Experience designing evaluation frameworks for LLMs, retrieval systems, agentic workflows, or other AI-enabled products, Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow, Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery, Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems, Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling, Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow redesign</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale AI</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale AI develops reliable AI systems for the world&apos;s most important decisions.</Employerdescription>
      <Employerwebsite>https://scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4636227005</Applyto>
      <Location>San Francisco, CA; New York, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b68ff4cc-e74</externalid>
      <Title>Data Engineer, Safeguards</Title>
      <Description><![CDATA[<p><strong>About the role</strong></p>
<p>Anthropic is looking for a Data Engineer to join the Safeguards team and build the data foundations that keep our AI systems safe. The Safeguards team works to monitor models, prevent misuse, and ensure user well-being.</p>
<p>You&#39;ll design and build the data pipelines, warehousing solutions, and analytical tooling that power our safety and trust efforts at scale. You&#39;ll work closely with engineers, data scientists, and policy teams to ensure the Safeguards organization has the data it needs to detect abuse patterns, measure the effectiveness of safety interventions, and make informed decisions about model behavior and enforcement.</p>
<p>This is a high-impact role where your work will directly support Anthropic&#39;s mission to develop AI that is safe and beneficial.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design, build, and maintain scalable data pipelines that support safety monitoring, abuse detection, and enforcement workflows</li>
<li>Develop and optimize data models and warehousing solutions to enable efficient analysis of large-scale usage and safety data</li>
<li>Build and maintain dashboards and reporting infrastructure that give Safeguards teams visibility into model behavior, misuse patterns, and enforcement outcomes</li>
<li>Collaborate with engineers to integrate data from multiple sources, including model outputs, user reports, and automated classifiers, into a unified analytical layer</li>
<li>Implement data quality frameworks, monitoring, and alerting to ensure the reliability of safety-critical data</li>
<li>Partner with research teams to surface data insights that inform model improvements and safety interventions</li>
<li>Develop self-service data tooling that enables stakeholders to explore safety data and generate reports independently</li>
<li>Contribute to data governance practices, including access controls, retention policies, and privacy-compliant data handling</li>
</ul>
<p><strong>You may be a good fit if you:</strong></p>
<ul>
<li>Have 3+ years of experience in data engineering, analytics engineering, or a related role</li>
<li>Are proficient in SQL and Python, with experience building and maintaining ETL/ELT pipelines</li>
<li>Have hands-on experience with modern data stack tools such as dbt, Airflow, Spark, or similar orchestration and transformation frameworks</li>
<li>Have worked with cloud data platforms (BigQuery, Redshift, Snowflake, or similar)</li>
<li>Are comfortable building dashboards and data visualizations using tools like Looker, Tableau, or Metabase</li>
<li>Communicate clearly and can translate complex data concepts for both technical and non-technical audiences</li>
<li>Are results-oriented, flexible, and willing to pick up slack even when it falls outside your job description</li>
<li>Care about the societal impacts of AI and are motivated by safety work</li>
</ul>
<p><strong>Strong candidates may have:</strong></p>
<ul>
<li>Experience with trust &amp; safety, integrity, fraud, or abuse detection data systems</li>
<li>Experience with large-scale event streaming systems (Kafka, Pub/Sub, Kinesis)</li>
<li>Built data infrastructure that supports ML model monitoring or evaluation</li>
<li>A background in statistical analysis, or experience collaborating closely with data scientists</li>
<li>Developed internal tooling or self-service analytics platforms</li>
</ul>
<p><strong>Strong candidates need not have:</strong></p>
<ul>
<li>A formal degree in Computer Science or a related field; we value practical experience and demonstrated ability over credentials</li>
<li>Prior experience in AI or machine learning; you&#39;ll learn the domain-specific context on the job</li>
<li>Previous experience at an AI safety or research organization</li>
<li>Deep expertise across every tool listed above; familiarity with a subset and a willingness to learn is enough</li>
</ul>
<p><strong>Logistics</strong></p>
<ul>
<li>Minimum education: Bachelor’s degree or an equivalent combination of education, training, and/or experience</li>
<li>Required field of study: A field relevant to the role as demonstrated through coursework, training, or professional experience</li>
<li>Minimum years of experience: Years of experience required will correlate with the internal job level requirements for the position</li>
<li>Location-based hybrid policy: Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</li>
<li>Visa sponsorship: We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</li>
</ul>
<p><strong>How we&#39;re different</strong></p>
<p>We believe that the highest-impact AI research will be big science. At Anthropic we work as a single cohesive team on just a few large-scale research efforts. And we value impact, advancing our long-term goals of steerable, trustworthy AI, rather than work on smaller and more specific puzzles. We view AI research as an empirical science, which has as much in common with physics and biology as with traditional efforts in computer science. We&#39;re an extremely collaborative group, and we host frequent research discussions to ensure that we are pursuing the highest-impact work at any given time. As such, we greatly value communication skills. The easiest way to understand our research directions is to read our recent research. This research continues many of the directions our team worked on prior to Anthropic, including: GPT-3, Circuit-Based Interpretability, Multimodal Neurons, Scaling Laws, AI &amp; Compute, Concrete Problems in AI Safety, and Learning from Human Preferences.</p>
<p><strong>Come work with us!</strong></p>
<p>Anthropic is a public benefit corporation headquartered in San Francisco. We offer competitive compensation and benefits, optional equity donation matching, generous vacation and parental leave, flexible working hours, and a lovely office space in which to collaborate with colleagues.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>£170,000-£220,000 GBP</Salaryrange>
      <Skills>SQL, Python, ETL/ELT pipelines, dbt, Airflow, Spark, cloud data platforms, BigQuery, Redshift, Snowflake, Looker, Tableau, Metabase</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a public benefit corporation that creates reliable, interpretable, and steerable AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5156057008</Applyto>
      <Location>London, UK</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>04c1ff49-2d1</externalid>
      <Title>Data Platform Solutions Architect (Professional Services)</Title>
      <Description><![CDATA[<p>We&#39;re hiring for multiple roles within our Professional Services team. As a Data Platform Solutions Architect, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to customers&#39; successful understanding, evaluation, and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Extensive experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 10% of the time</li>
</ul>
<p>Preferred: Databricks Certification (not essential)</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, technical project delivery, documentation and white-boarding skills, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8396801002</Applyto>
      <Location>London, United Kingdom</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>273f3f27-7de</externalid>
      <Title>Staff Product Manager, Content Experience</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Product Manager to lead our Content Experience strategy. In this role, you will own how users discover, learn from, and act on content: across documentation, in-product help, AI-assisted guidance, and beyond.</p>
<p>You&#39;ll help define the future of content at Databricks by making it a first-class product experience and integrating content development into product development.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Owning the content experience end-to-end, ensuring it&#39;s helpful, intuitive, and actionable</li>
<li>Driving strategic improvements to content tooling and workflows</li>
<li>Building an architecture of participation that enables context experts to contribute directly to content</li>
<li>Integrating AI to transform content experiences</li>
<li>Defining metrics that matter and tracking content engagement, time-to-task, support deflection, and user satisfaction</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>7+ years of product management experience, with a proven track record of leading cross-functional initiatives and delivering high-impact user experiences</li>
<li>Deep understanding of developer tools, data platforms, or technical products with large surface areas</li>
<li>Strong systems mindset, comfortable designing scalable workflows, content architectures, and tooling integrations</li>
<li>Experience with developer documentation, content platforms, or product onboarding is a plus</li>
<li>Strong customer empathy and an obsession with helping users succeed</li>
<li>Familiarity with AI technologies (especially LLMs) and how they can be applied to content workflows and user guidance</li>
<li>Experience working with technical and non-technical contributors in a collaborative content ecosystem</li>
</ul>
<p>Pay Range Transparency: Databricks is committed to fair and equitable compensation practices. The pay range for this role is $181,700-$249,800 USD.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$181,700-$249,800 USD</Salaryrange>
      <Skills>Product Management, Content Strategy, AI Technologies, Developer Tools, Data Platforms, Technical Products, LLMs, Content Platforms, Product Onboarding</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8040989002</Applyto>
      <Location>San Francisco, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>5b244f27-9fd</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases. You will work with engagement managers to scope a variety of professional services work with input from the customer.</p>
<p>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications. Consult on architecture and design; bootstrap or implement customer projects which lead to customers&#39; successful understanding, evaluation, and adoption of Databricks.</p>
<p>Provide an escalated level of support for customer operational issues. You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</p>
<p>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</p>
<p>The ideal candidate will have:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfort writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
</ul>
<p>Travel to customers 20% of the time.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461258002</Applyto>
      <Location>Raleigh, North Carolina</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0fe57d9a-28e</externalid>
      <Title>Engagement Manager</Title>
      <Description><![CDATA[<p>Job Title: Engagement Manager</p>
<p>We are seeking an experienced Engagement Manager to join our team in Tokyo. As an Engagement Manager, you will be responsible for driving customer success by ensuring that our customers are getting the most value out of our products and services.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Collaborate with sales counterparts to understand customer needs and develop valued solutions</li>
<li>Identify opportunities for new services and articulate the business value</li>
<li>Perform as the Engagement Manager in the assigned area with full accountability for meeting/exceeding Professional Services and Training bookings and revenue targets</li>
<li>Consult with clients to understand and analyze engagement scope, requirements, time, cost, and benefits</li>
<li>Drive resolution of delivery challenges, address resource contentions, scoping issues, and manage expectations</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Strong fundamental knowledge of Big Data Platforms implementation from the technology, operations, and security/governance lenses</li>
<li>Proven experience selling services offerings in an implementation, advisory, education, or change management capacity</li>
<li>Experience in senior customer-facing roles that require a mix of influencing, validating, negotiating, understanding, and execution with both business and technology audiences</li>
<li>Consistent track record of identifying customer needs and successfully implementing solutions</li>
<li>Experience owning projects/programs in agile scrum/kanban as well as waterfall delivery methodologies</li>
<li>Strong problem-solving skills, addressing customers&#39; pain points using modern technologies</li>
<li>Excellence in presentation skills, providing proposals that enforce good project governance and drive scalable delivery practices to both internal and external executives</li>
<li>High-level orchestration skills to align both internal and external stakeholders when proposing large initiatives</li>
<li>Strong service delivery and program management skills with the ability to synthesize customer success outcomes into well-structured program plans that deliver against such outcomes</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Prior experience in project/program proposal to customers at Consulting, SI, Software/Cloud Vendor</li>
<li>Bachelor&#39;s degree in Computer Science or related educational background</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Comprehensive benefits and perks that meet the needs of all employees</li>
</ul>
<p>Commitment to Diversity and Inclusion:</p>
<ul>
<li>Databricks is committed to fostering a diverse and inclusive culture where everyone can excel</li>
</ul>
<p>Compliance:</p>
<ul>
<li>Access to export-controlled technology or source code is required for performance of job duties, and it is within Employer&#39;s discretion whether to apply for a U.S. government license for such positions</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Big Data Platforms Implementation, Customer Success, Project Management, Program Management, Agile Scrum/Kanban Delivery Methodology, Waterfall Methodology, Problem-Solving Skill, Presentation Skills, Project Governance, Service Delivery, Prior Experience in Project/Program Proposal to Customers at Consulting, SI, Software/Cloud Vendor, Bachelor&apos;s Degree in Computer Science or Related Educational Background</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. It was founded by the original creators of the lakehouse architecture, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8501186002</Applyto>
      <Location>Tokyo, Japan</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>f1950023-ef7</externalid>
      <Title>Senior Engineering Manager, Activation</Title>
      <Description><![CDATA[<p>Why join us</p>
<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>
<p>Brex’s AI-native automation and world-class service eliminate manual expense and accounting tasks for customers so they can focus on what matters most. Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>
<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream. We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>
<p>Engineering</p>
<p>Engineering at Brex is about building systems that scale with speed and intention. Our teams span Software, Data, Security, and IT, and operate with high autonomy and deep collaboration. We tackle hard technical problems, own our outcomes, and push for excellence at every level, from architecture to deployment. It’s an environment where engineering is a craft, and builders become leaders.</p>
<p>What you’ll do</p>
<p>You will lead an engineering group focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, integrations, and implementation workflows that help customers realize value quickly. This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced customer journeys.</p>
<p>The ideal candidate is a seasoned engineering leader with experience scaling user-facing onboarding systems, delivering high-quality product experiences, and partnering deeply across Product, Design, Operations, and GTM teams.</p>
<p>Where you’ll work</p>
<p>This role will be based in our New York office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday. As a perk, we also have up to four weeks per year of fully remote work!</p>
<p>Responsibilities</p>
<ul>
<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding and implementation experience.</li>
<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>
<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>
<li>Drive execution of the Activation roadmap, ensuring timely, high-quality delivery of systems and features that help customers activate and realize value.</li>
<li>Lead and manage multiple teams of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>
<li>Build systems that integrate identity verification, KYC and compliance workflows, customer data ingestion, and implementation tooling in a scalable and reliable manner.</li>
<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>
<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>
</ul>
<p>Requirements</p>
<ul>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>
<li>Strong technical background and understanding of software development principles.</li>
<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>
<li>Demonstrated track record of shipping customer-facing features across multiple release cycles.</li>
<li>3+ years of experience managing or leading multiple technical teams in a high-growth environment.</li>
<li>Regularly works with cross-functional partners (e.g. Product, Design, Operations, Sales) and excels in driving alignment across stakeholders.</li>
<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer facing experiences.</li>
<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>
<li>Track record building AI-powered product experiences, including LLM-driven automation and personalization.</li>
</ul>
<p>Bonus points</p>
<ul>
<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>
<li>You have started your own technology venture or were an early technical founder/employee. We value entrepreneurial spirit &amp; scrappiness!</li>
<li>You are a champion for the customer and constantly put yourself in their shoes to create intuitive, frictionless experiences.</li>
</ul>
<p>Compensation</p>
<p>The expected salary range for this role is $300,000 - $375,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$300,000 - $375,000</Salaryrange>
      <Skills>Technical leadership, Software development principles, Full-stack engineering, Customer-facing features, Data-driven mindset, AI-powered product experiences, LLM-driven automation, Personalization, Data platforms, Snowflake, Hex, Entrepreneurial spirit, Scrappiness, Customer obsession</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is a financial platform that provides corporate cards and banking services to companies in over 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8330492002</Applyto>
      <Location>New York, New York, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>22bcbb50-ef4</externalid>
      <Title>Member of Technical Staff - Data Platform</Title>
      <Description><![CDATA[<p><strong>About the Role</strong></p>
<p>The Data Platform team at xAI builds and operates the infrastructure responsible for all large-scale data transport and processing across the company.</p>
<p>As a software engineer on the Data Platform team, you will design, build, and operate the distributed systems powering X&#39;s data movement and compute.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design and implement high-throughput, low-latency data ingestion and transport systems.</li>
<li>Scale and optimise multi-tenant Kafka infrastructure supporting real-time workloads.</li>
<li>Extend and tune Spark, Flink, and Trino for demanding production pipelines.</li>
<li>Build interfaces, APIs, and pipelines enabling teams to query, process, and move data at petabyte scale.</li>
<li>Debug and optimise distributed systems, with a focus on reliability and performance under load.</li>
<li>Collaborate with ML, product, and infrastructure teams to unblock critical data workflows.</li>
</ul>
<p><strong>Basic Qualifications</strong></p>
<ul>
<li>Proven expertise in distributed systems, stream processing, or large-scale data platforms.</li>
<li>Proficiency in Rust, Go, Scala or similar systems languages.</li>
<li>Hands-on experience with Kafka, Flink, Spark, Trino, or Hadoop in production.</li>
<li>Strong debugging, profiling, and performance optimisation skills.</li>
<li>Track record of shipping and maintaining critical infrastructure.</li>
<li>Comfortable working in fast-moving, high-stakes environments with minimal guardrails.</li>
</ul>
<p><strong>Compensation and Benefits</strong></p>
<p>$180,000 - $440,000 USD</p>
<p>Base salary is just one part of our total rewards package at X, which also includes equity, comprehensive medical, vision, and dental coverage, access to a 401(k) retirement plan, short &amp; long-term disability insurance, life insurance, and various other discounts and perks.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,000 - $440,000 USD</Salaryrange>
      <Skills>Rust, Go, Scala, Kafka, Flink, Spark, Trino, Hadoop, distributed systems, stream processing, large-scale data platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>xAI</Employername>
      <Employerlogo>https://logos.yubhub.co/x.com.png</Employerlogo>
      <Employerdescription>xAI creates AI systems to understand the universe and aid humanity in its pursuit of knowledge.</Employerdescription>
      <Employerwebsite>https://www.x.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/xai/jobs/4803862007</Applyto>
      <Location>Palo Alto, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>477d343e-e37</externalid>
      <Title>Customer Success Architect</Title>
      <Description><![CDATA[<p>About Mixpanel</p>
<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence. Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees.</p>
<p>About the Customer Success Team:</p>
<p>Mixpanel’s Customer Success &amp; Solutions Engineering teams are analytics consultants who embed themselves within our enterprise customer teams to drive our customers’ business outcomes. We work with prospects and customers throughout the customer journey to understand what drives value and serve as the technical counterpart to our Sales organization to deliver on that value.</p>
<p>You will partner closely with Account Executives, Account Managers, Product, Engineering, and Support to successfully roll out self-serve analytics within our customers’ organizations, help the customer manage change, execute on technical projects and services that delight our customers, and ultimately drive ROI on the customer’s Mixpanel investment.</p>
<p>About the Role:</p>
<p>As a CSA, you will partner with customers throughout the customer journey to understand what drives value: from pre-sales, where you run proofs of concept to demonstrate quick time to value, to post-sales onboarding and implementation, where you set customers up for long-term success with scalable implementation and data governance best practices. Throughout the entire customer lifecycle, you will work to understand how analytics can drive business value for your customers and consult with them on how to maximize the value of Mixpanel, including managing change during Mixpanel’s rollout, defining and achieving ROI, and identifying areas of improvement in their current usage of analytics.</p>
<p>For large enterprise customers, you will continue working alongside Account Managers after onboarding to drive data trust and product adoption across 100+ end-user teams through a change-management rollout approach.</p>
<p>Responsibilities:</p>
<ul>
<li>Serve as a trusted technical advisor for prospects/customers to provide strategic consultation on data architecture, governance, instrumentation, and business outcomes</li>
<li>Effectively communicate at most levels of the customer’s organization to influence business outcomes via Mixpanel, design and execute a comprehensive analytics strategy, and unblock technical and organizational roadblocks</li>
<li>Own the customer’s success with Mixpanel, documenting and delivering ROI to the customer throughout their journey to transform their business with self-serve analytics</li>
<li>Own onboarding and data health for your assigned customers/projects, including ongoing enhancements to their data quality and overall tech stack integration</li>
<li>Engage with customers’ engineering, product management, and marketing teams to handle technical onboarding, optimize Mixpanel deployments, and improve data trust</li>
<li>Deliver a variety of technical services ranging from data architecture consultations to adoption and change management best practices</li>
<li>Leverage modern data architecture expertise to create scalable data governance practices and data trust for our customers, including data optimization and re-implementation projects</li>
<li>Successfully execute on success outcomes whilst balancing project timelines, scope creep, and unanticipated issues</li>
<li>Bridge the technical-business gap with your customers, working with business stakeholders to define a strategic vision for Mixpanel and then working with the right business and technical contacts to execute that vision</li>
<li>Collaborate with our technical and solutions partners as needed on data optimization and onboarding projects</li>
<li>Be a technical sponsor for internal engagements with Mixpanel product and engineering teams to prioritize product and systems tasks from clients</li>
</ul>
<p>We&#39;re Looking For Someone Who Has</p>
<ul>
<li>3 to 5 years of experience consulting on defining and delivering ROI through new tool implementations</li>
<li>Experience working with Director-level members of the customer organization to define a strategic vision and successfully leveraging those members to deliver on that vision</li>
<li>The ability to communicate with stakeholders at most levels of an organization, from talking with developers about the ins and outs of an API to talking to a Director of Data Science/Product Management about organizational efficiency</li>
<li>The ability to manage complex projects with assorted client stakeholders, working across teams and departments to execute real change</li>
<li>A demonstrated record of success in customer success, client-facing professional services, consulting, or technical project management roles</li>
<li>Excellent written, analytical, and communication skills</li>
<li>Strong process and/or project delivery discipline</li>
<li>Eagerness to learn new technologies and adapt to evolving customer needs</li>
</ul>
<p>We&#39;d Be Extra Excited For Someone Who Has</p>
<ul>
<li>Experience in data querying, modeling, and transforming in at least one core tool, such as SQL, dbt, Python, Business Intelligence tools, or Product Analytics tools</li>
<li>Familiarity with databases and cloud data warehouses like Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, etc.</li>
<li>Familiarity with product analytics implementation methods like SDKs, Customer Data Platforms (CDPs), Event Streaming, Reverse ETL, etc.</li>
<li>Familiarity with analytics best practices across business segments and verticals</li>
</ul>
<p>Benefits and Perks</p>
<ul>
<li>Comprehensive Medical, Vision, and Dental Care</li>
<li>Mental Wellness Benefit</li>
<li>Generous Vacation Policy &amp; Additional Company Holidays</li>
<li>Enhanced Parental Leave</li>
<li>Volunteer Time Off</li>
<li>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</li>
</ul>
<p>Culture Values</p>
<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>
<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>
<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>
<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>
<p>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north star.</p>
<p>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</p>
<p>Why choose Mixpanel?</p>
<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors, including Andreessen Horowitz, Sequoia, YC, and, most recently, Bain Capital.</p>
<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>
<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>
<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>
<p>Mixpanel is an equal opportunity employer supporting workforce diversity.</p>
<p>At Mixpanel, we are focused on the things that really matter: our people, our customers, and our partners. We recognize that those relationships are the most valuable assets we have.</p>
<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>
<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, or any other protected characteristic.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data architecture, governance, instrumentation, business outcomes, data querying, modeling, transforming, SQL, dbt, Python, Business Intelligence tools, Product Analytics tools, databases, cloud data warehouses, Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, SDKs, Customer Data Platforms, Event Streaming, Reverse ETL</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a leading provider of digital analytics software, serving over 29,000 companies worldwide.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7506821</Applyto>
      <Location>Bengaluru, India (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>52e9ea6f-e2a</externalid>
      <Title>Enterprise Account Executive</Title>
      <Description><![CDATA[<p>The Enterprise Account Executive will report to the Director of Enterprise GTM and own revenue growth across a portfolio of Scale AI&#39;s largest and most strategic enterprise customers. This role is focused on selling complex, highly technical AI solutions into F500 organisations, partnering with executive, technical, and operational stakeholders to drive long-term value and expansion.</p>
<p>You will be responsible for full-cycle enterprise sales - from prospecting and deal strategy through close, renewal, and expansion - while serving as the quarterback across internal teams including Solutions Engineering, Product, Research, and Operations. This role requires strong ownership, executive presence, and the ability to navigate multi-stakeholder enterprise buying processes in a fast-paced environment.</p>
<p>Responsibilities:</p>
<ul>
<li>Own and drive relationships with Scale&#39;s largest and most complex Fortune 500 prospects and customers</li>
<li>Build trusted relationships with executive, technical, and operational stakeholders across multiple business units</li>
<li>Develop and execute comprehensive account strategies to drive net-new revenue, expansion, and long-term partnerships</li>
<li>Lead strategic deal planning and mutual close plans across new business, renewals, and expansions</li>
<li>Partner closely with Solutions Engineering and Product teams to deliver compelling, technically credible value propositions</li>
<li>Act as the voice of the customer internally, influencing product roadmap, research priorities, and delivery execution</li>
<li>Maintain deep understanding of customer business goals, AI maturity, and industry trends to proactively identify opportunities</li>
<li>Consistently communicate account health, pipeline, and forecast accuracy using Salesforce, Clari, and related tools</li>
</ul>
<p>Ideally, You Will Have:</p>
<ul>
<li>8+ years of enterprise sales or account management experience, including 2+ years selling deeply technical solutions to both business and technical audiences</li>
<li>A proven track record of closing and expanding large, complex enterprise deals</li>
<li>Demonstrated success consistently achieving or exceeding quota in enterprise sales roles</li>
<li>Experience building and executing long-term account strategies to drive sustained revenue growth</li>
<li>Strong ability to lead enterprise renewal processes from strategy through close</li>
<li>Excellent written and verbal communication skills, with comfort presenting to executive audiences</li>
<li>Strong command of enterprise sales processes and systems (Salesforce, Clari, Outreach, Slack)</li>
<li>A consultative, customer-first mindset with the ability to influence cross-functional internal teams</li>
<li>Experience developing executive-level materials and business cases</li>
<li>Strong project management, organisational skills, and attention to detail</li>
<li>Technical background or strong technical curiosity highly valued, especially familiarity with AI, ML, or data platforms</li>
</ul>
<p>Sales Commission:</p>
<p>This role is eligible to earn commissions. Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training.</p>
<p>Benefits:</p>
<p>Comprehensive health, dental and vision coverage, retirement benefits, a learning and development stipend, and generous PTO.</p>
<p>Additional benefits may include a commuter stipend.</p>
<p>Salary Range:</p>
<p>$207,200 - $259,000 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$207,200 - $259,000 USD</Salaryrange>
      <Skills>Enterprise sales, Account management, Technical sales, Executive presence, Communication skills, Project management, Organisational skills, Attention to detail, AI, ML, Data platforms</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Scale AI</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale AI develops reliable AI systems for the world&apos;s most important decisions, providing high-quality data and full-stack technologies to power leading models.</Employerdescription>
      <Employerwebsite>https://www.scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4646946005</Applyto>
      <Location>San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b3cf0ff9-4c6</externalid>
      <Title>Support Engineer II</Title>
      <Description><![CDATA[<p>About Mixpanel</p>
<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>
<p>Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees. Mixpanel delivers insights that customers trust.</p>
<p>Visit mixpanel.com to learn more.</p>
<p>About The Support Team</p>
<p>Mixpanel Support is a team of talented problem-solvers from diverse backgrounds. We care deeply about helping our customers be successful and enabling them to get value from their data.</p>
<p>We are located all over the world in San Francisco, Barcelona, London, and Singapore...</p>
<p>About The Role</p>
<p>The right candidate is an avid learner, an advocate for customers, and a collaborative teammate. The main responsibility of a Support Engineer is to help users solve technical challenges and use Mixpanel to make impactful product decisions.</p>
<p>We’ve had team members focus on developing their technical skills to join the product and engineering teams, hone their customer-facing skills to become customer success managers or sales engineers, and take on leadership roles in the Support organization.</p>
<p>Responsibilities</p>
<p>The core responsibility of a Support Engineer is to support our customers at every turn in the Mixpanel journey by providing answers to product questions, sharing best practices, and debugging technical issues.</p>
<p>You&#39;ll also develop your technical skills, collaborate with our Product team to improve our product, learn product analytics, and mentor new team members.</p>
<ul>
<li>Become a Mixpanel product expert - you will help users understand our reports and features, help them use our APIs and SDKs, share best practices, and resolve account issues</li>
<li>Respond to customer inquiries via Zendesk email, chat, Slack, and phone calls</li>
<li>Investigate and document bugs and feature requests to share with our Product and Engineering teams</li>
<li>Provide feedback regarding internal support processes, product functionality, and customer education resources to improve the customer experience</li>
<li>Shape the product by regularly working closely with PMs, engineers, and designers to incorporate customer learnings into product changes</li>
</ul>
<p>We&#39;re Looking For Someone Who Has</p>
<ul>
<li>Experience providing customer-facing SaaS support (in customer support, professional services, technical account management, or similar)</li>
<li>Ability to communicate technical concepts effectively in a clear, friendly writing style</li>
<li>Excellent problem-solving and analytical skills</li>
<li>Programming experience, understanding of web &amp; mobile technologies, and experience interacting with APIs</li>
<li>Experience with debugging and collaborating with engineering to resolve complex technical issues, especially with JavaScript, Python, or mobile technologies</li>
<li>Ability to be resourceful and resilient when faced with ambiguity and new challenges</li>
<li>Dedication to developing expertise in a complex and constantly evolving product</li>
<li>Interest and aptitude to develop technical skills and learn new technologies</li>
<li>Experience providing SLA-based support and/or dedicated support to strategic customers</li>
<li>Fluency in Hebrew and English</li>
</ul>
<p>Bonus Points</p>
<ul>
<li>Experience with Mixpanel or other analytics tools</li>
<li>Familiarity with databases and cloud data warehouses like Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, etc.</li>
<li>Familiarity with product analytics implementation methods like SDKs, Customer Data Platforms (CDPs), Event Streaming, Reverse ETL, etc.</li>
</ul>
<p>Benefits and Perks</p>
<ul>
<li>Comprehensive Medical, Vision, and Dental Care</li>
<li>Mental Wellness Benefit</li>
<li>Generous Vacation Policy &amp; Additional Company Holidays</li>
<li>Enhanced Parental Leave</li>
<li>Volunteer Time Off</li>
<li>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</li>
</ul>
<p>Culture Values</p>
<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>
<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>
<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>
<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>
<p>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north star.</p>
<p>Why choose Mixpanel?</p>
<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors, including Andreessen Horowitz, Sequoia, YC, and, most recently, Bain Capital.</p>
<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>
<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>
<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>
<p>Mixpanel is an equal opportunity employer supporting workforce diversity.</p>
<p>At Mixpanel, we are focused on the things that really matter: our people, our customers, and our partners. We recognize that those relationships are the most valuable assets we have.</p>
<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>
<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, veteran status, or disability status.</p>
<p>Pursuant to the San Francisco Fair Chance Ordinance or other similar laws that may be applicable, we will consider for employment qualified applicants with arrest and conviction records.</p>
<p>We’ve immersed ourselves in our Culture and Values as our guiding principles for the impact we want to have and the future we are building.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel></Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>customer facing SAAS support, technical concepts, problem-solving, programming experience, web &amp; mobile technologies, APIs, debugging, collaboration, SLA based support, dedicated support, Hebrew, English, Mixpanel, analytics tools, databases, cloud data warehouses, product analytics implementation methods, SDKs, Customer Data Platforms, Event Streaming, Reverse ETL</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a digital analytics platform that helps teams accelerate adoption, improve retention, and ship with confidence. It has over 29,000 customers, including Workday, Pinterest, LG, and Rakuten Viber.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7650541</Applyto>
      <Location>Tel Aviv, Israel (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6b0c92b4-c05</externalid>
      <Title>Sr. Manager, Field Engineering</Title>
      <Description><![CDATA[<p>We are looking for a dynamic Sr. Manager, Field Engineering to lead a team of Solution Architects within our Retail vertical. As a key member of our Field Engineering team, you will be responsible for helping customers in the Consumer Goods segment succeed with Databricks and providing outsized value to their businesses. You will also be responsible for maintaining a robust hiring pipeline, establishing relationships across the business, and partnering with sales leadership to hit sales and consumption targets.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Hiring, training, growing, and managing a team of Solutions Architects</li>
<li>Making customers in the Consumer Goods segment successful with Databricks and providing outsized value to their businesses</li>
<li>Maintaining a robust hiring pipeline at all times</li>
<li>Establishing relationships across the business to make customers and team successful</li>
<li>Partnering with sales leadership to hit sales and consumption targets</li>
</ul>
<p>The ideal candidate will have 7+ years of professional experience in the data space with a technical product, 3+ years of experience in the field, architecting and delivering data-driven solutions for major accounts within the Retail, Consumer Products, Travel &amp; Hospitality vertical, and a deep familiarity with the buy-side and supply-side ecosystem.</p>
<p>In addition, the candidate should have demonstrated expertise with the data collaboration ecosystem, 3+ years of experience building and leading technical pre-sales teams, a deep technical understanding of the impact that Data + AI can drive within the Retail industry, and a track record as a trusted advisor to technical executives who guide strategic data infrastructure decisions.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$192,100-$264,175 USD</Salaryrange>
      <Skills>data warehousing, big data, machine learning, data collaboration ecosystem, customer data platforms, clean rooms, data marketplaces</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8362888002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>2f962d3f-14e</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, including 3rd-party migrations and end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to customers&#39; successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461218002</Applyto>
      <Location>Dallas, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>995b724b-c85</externalid>
      <Title>Senior Sales Engineer, Partnerships</Title>
      <Description><![CDATA[<p>We are seeking a Senior Sales Engineer, Partnerships to join our team. As a Senior Sales Engineer, you will be responsible for providing technical expertise and strategic enablement to partners, facilitating strategies to pursue avenues of revenue outside of the life sciences. This role bridges technical knowledge and business strategy, supporting partners during discovery, qualification, and solution design to showcase the value of Komodo&#39;s healthcare data and analytics platform.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Serve as a technical lead on 8-10 strategic opportunities, directly influencing deal cycles and accelerating revenue growth.</li>
<li>Become the definitive subject matter expert on Komodo&#39;s comprehensive suite of healthcare data assets and platform capabilities.</li>
<li>Garner subject matter expertise and ownership of a segment within the Partnerships / Channel Partnerships organization.</li>
<li>Develop scalable technical frameworks, demo environments, and reusable assets that set new organizational standards, with a heavy emphasis on agentic AI workflows.</li>
<li>Drive cross-functional initiatives by partnering with Product, Data Science, and Engineering to deliver customized, innovative solutions.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>7+ years of experience in Sales Engineering or Solutions Engineering with a focus on healthcare data and healthcare technology.</li>
<li>Proven track record of understanding and leveraging AI tools to enhance SaaS products or improve operational workflows.</li>
<li>Expertise in healthcare data (e.g., 837/835 transactions, NDC codes) and its practical applications in analytics, reporting, and decision-making.</li>
<li>Strong technical skills, including experience with APIs, data integration, cloud-based architectures (e.g., AWS, Azure), and analyzing large datasets.</li>
<li>An understanding of and proficiency in data science techniques, specifically SQL, Python, and/or R.</li>
<li>Excellent communication and presentation skills, with the ability to train partners and translate complex technical concepts for diverse stakeholders.</li>
</ul>
<p>Preferred Skills:</p>
<ul>
<li>Experience working within the provider, payer, or financial service segments.</li>
<li>Technical certifications in AWS, Azure, or data platforms.</li>
<li>Experience with CRM platforms like Salesforce for managing partner and client interactions.</li>
<li>Familiarity with data visualization tools (e.g., Tableau, Looker) to create impactful partner training materials.</li>
<li>Knowledge of identity resolution and privacy-preserving linking technologies.</li>
<li>Prior experience developing joint business plans and co-sell strategies with channel partners.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$143,000-$193,000 USD</Salaryrange>
      <Skills>Sales Engineering, Healthcare Data, Healthcare Technology, AI Tools, APIs, Data Integration, Cloud-Based Architectures, Data Science Techniques, SQL, Python, R, Excellent Communication, Presentation Skills, Experience Working Within Provider, Payer, or Financial Service Segments, Technical Certifications in AWS, Azure, or Data Platforms, Experience with CRM Platforms Like Salesforce, Familiarity with Data Visualization Tools, Knowledge of Identity Resolution and Privacy-Preserving Linking Technologies, Prior Experience Developing Joint Business Plans and Co-Sell Strategies</Skills>
      <Category>Sales</Category>
      <Industry>Healthcare</Industry>
      <Employername>Komodo Health</Employername>
      <Employerlogo>https://logos.yubhub.co/komodohealth.com.png</Employerlogo>
      <Employerdescription>Komodo Health is a healthcare technology company that aims to reduce the global burden of disease by providing a comprehensive view of the US healthcare system.</Employerdescription>
      <Employerwebsite>https://www.komodohealth.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/komodohealth/jobs/8495825002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>64cc7147-8b9</externalid>
      <Title>Sr. Manager, Technical Solutions</Title>
      <Description><![CDATA[<p>We are looking for a Senior Technical Solutions Manager to grow, lead and manage the Technical solutions engineers and support teams in India. The Senior Technical Solutions Manager is responsible for building and managing a regional team of technical experts focused on resolving highly complex and long-running support tickets raised by Databricks customers while overseeing the Support operations.</p>
<p>Impact you will have:</p>
<ul>
<li>Build and manage a team of Technical Solution Engineers</li>
<li>Provide coaching and mentorship to the engineers</li>
<li>Identify and implement process improvements to meet or exceed regional performance KPIs.</li>
<li>Establish training plans and subject matter expertise within the team.</li>
<li>Drive support escalations and establish cross-functional collaboration to manage and resolve issues.</li>
<li>Be a player-coach and provide technical leadership to the regional support team.</li>
<li>Coordinate with Sales and field teams to address account-level concerns and drive adoption and usage of the Databricks platform.</li>
<li>Define quarterly goals and track them to completion to drive team growth and personal development.</li>
<li>Scale the organisation by developing processes and guidelines that promote operational efficiency</li>
<li>Demonstrate a true sense of ownership and coordinate action items with engineering and escalation teams to achieve timely resolution of customer issues.</li>
<li>Perform risk assessments and be a hands-on leader</li>
</ul>
<p>What we are looking for:</p>
<ul>
<li>Minimum 15 years of experience in the Tech Industry</li>
<li>Experience with SaaS support, including building, testing, and maintaining SaaS services</li>
<li>6+ years of managerial experience, leading a team of at least six technical support engineers</li>
<li>Proven experience working with cloud native applications/SaaS (AWS, Azure, GCP), big data platforms, or Apache Spark™ in a technical capacity</li>
<li>Demonstrated experience in a customer-facing role managing a large regional team of technical support engineers.</li>
<li>Excellent analytical and troubleshooting skills.</li>
<li>Excellent customer facing, verbal and written communication skills</li>
<li>A team-oriented attitude and a high degree of comfort working in a startup environment</li>
<li>Hands-on experience in systems troubleshooting, networking, and Linux fundamentals; JVM troubleshooting and debugging of Java applications are preferred</li>
</ul>
<p>Benefits</p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region click here.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SaaS Support, Cloud Native Applications, Big Data Platforms, Apache Spark, Java, Linux, Networking</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform. It has over 10,000 organisations worldwide as clients.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8341135002</Applyto>
      <Location>Bengaluru, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0036f074-845</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, including 3rd-party migrations and end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects which lead to customers&#39; successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, design and deployment of highly performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456966002</Applyto>
      <Location>Boston, Massachusetts</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c6a4bf88-cba</externalid>
      <Title>Engineering Manager, Spark Connect</Title>
      <Description><![CDATA[<p>We are building the world&#39;s best data and AI infrastructure platform so our customers can solve the world&#39;s toughest problems. The Spark Platform organisation builds the core technologies that power Databricks and the Apache Spark ecosystem. We are looking for an Engineering Manager to lead the Spark Connect team.</p>
<p>This leader will own the Databricks Spark Connect platform, drive reliability and execution for a critical Serverless component, and lead our open source Spark Connect strategy in Apache Spark. This includes owning the OSS roadmap, partnering across teams, and driving adoption of Spark Connect in the open source ecosystem.</p>
<p>Responsibilities:</p>
<ul>
<li>Lead the team responsible for Spark Connect, a critical component of Databricks Serverless.</li>
<li>Own platform reliability, scalability, and operational excellence.</li>
<li>Drive Databricks&#39; technical leadership in open source Spark Connect.</li>
<li>Define and execute the strategy for OSS Spark Connect adoption and ecosystem growth.</li>
<li>Partner across product, runtime, serverless, and OSS stakeholders to shape roadmap and architecture.</li>
<li>Hire and grow a strong engineering team with high technical and operational standards.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>10+ years of experience in distributed systems, infrastructure, or data platforms.</li>
<li>3+ years of experience managing high-performing engineering teams.</li>
<li>Strong technical depth and a track record of owning critical production systems.</li>
<li>Experience leading cross-functional initiatives and influencing technical direction.</li>
<li>Passion for open source, developer platforms, and large-scale systems.</li>
</ul>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location is in visit our page here.</p>
<p>Zone 2 Pay Range $180,500-$248,150 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,500-$248,150 USD</Salaryrange>
      <Skills>distributed systems, infrastructure, data platforms, engineering management, open source, developer platforms, large-scale systems</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform to over 10,000 organizations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8502969002</Applyto>
      <Location>Bellevue, Washington</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>e68e5c3b-1e2</externalid>
      <Title>Lakebase Account Executive</Title>
      <Description><![CDATA[<p>We are seeking a Lakebase Account Executive to help customers modernize their operational data foundation with Databricks Lakebase, our fully-managed Postgres offering for intelligent applications.</p>
<p>As a Lakebase Account Executive, you will drive new Lakebase revenue by identifying, qualifying, and closing Lakebase opportunities within a defined territory, in partnership with regional Account Executives and the broader account team.</p>
<p>You will lead with outcomes for key Lakebase personas, including platform teams and developers, data teams, and central IT, articulating how Lakebase helps them ship features faster, simplify operational data architectures, and improve governance and cost efficiency.</p>
<p>You will sell the value of fully-managed Postgres for intelligent applications, positioning Lakebase as the optimal choice for operational workloads that power real-time, AI-driven experiences.</p>
<p>You will run complex, multi-threaded sales cycles from discovery and value hypothesis through commercial negotiation and close, navigating executive, technical, and line-of-business stakeholders.</p>
<p>You will orchestrate proof-of-value and POCs that validate Lakebase’s benefits for OLTP-style workloads, reverse ETL, and AI/ML-driven applications, in partnership with solution architects and specialists.</p>
<p>You will compete and win against legacy and cloud-native operational databases by leveraging our compete assets, benchmarks, and customer references.</p>
<p>You will align to measurable business outcomes such as performance, developer productivity, time-to-market for new features, cost reduction, and simplification of the operational data landscape.</p>
<p>You will partner cross-functionally with Product Management, Marketing, Customer Success, and Partner teams to shape territory plans, launch plays, and co-selling motions with key ISVs and GSIs.</p>
<p>You will enable the field by sharing Lakebase best practices, success stories, and sales motions with broader sales teams, helping scale Lakebase proficiency across the organization.</p>
<p>This role requires the ability to operate across two key motions simultaneously:</p>
<ul>
<li>Establish top strategic focus accounts by engaging application development teams to create net-new intelligent applications leveraging Lakebase.</li>
<li>Drive longer-term Postgres standardization and migration within Databricks&#39; most strategic accounts.</li>
</ul>
<p>Candidates should demonstrate how they can act as a force multiplier across multiple dimensions of the business.</p>
<p>Success in this role requires strength in four areas:</p>
<p>Business ownership – Operate at a business-unit level by tracking revenue, pipeline, and key observations, and by identifying areas needing additional focus or support.</p>
<p>Strategic account engagement – Partner with account teams to engage priority accounts across the global DB700, driving strategic opportunities from initial engagement through successful outcomes.</p>
<p>Field enablement – Build and execute enablement plans that empower AEs and SAs to confidently carry the Lakebase conversation even when the specialist is not present.</p>
<p>Market voice and thought leadership – Develop an internal and external presence by contributing to global AMAs and internal forums, and by representing Databricks at key first- and third-party events.</p>
<p>The interview process is designed to evaluate candidates across all four of these dimensions.</p>
<p>We are looking for a candidate with 7+ years of enterprise SaaS sales experience, consistently exceeding quota in complex, multi-stakeholder deals.</p>
<p>Proven success selling data platforms, operational databases (e.g., Postgres, MySQL, cloud-native DBaaS), or adjacent data/AI infrastructure to technical buyers and business leaders.</p>
<p>Strong understanding of modern data and application architectures, including cloud-native services, microservices, event-driven systems, and how operational data underpins AI and analytics strategies.</p>
<p>Ability to sell to both technical stakeholders (developers, architects, data engineers) and business stakeholders (product leaders, operations, line-of-business owners).</p>
<p>Demonstrated experience leading specialist or overlay motions, working jointly with core Account Executives to create and progress opportunities.</p>
<p>Executive presence with the ability to whiteboard architectures, lead C-level conversations, and build trust with senior decision makers.</p>
<p>Strong value selling skills: adept at discovering pain, building a business case, and tying technical capabilities to clear, quantified outcomes.</p>
<p>Excellent communication, storytelling, and negotiation skills, with comfort presenting to both large and small audiences.</p>
<p>Bachelor’s degree or equivalent practical experience.</p>
<p>Preferred qualifications include experience selling Postgres, operational databases, OLTP workloads, or transactional cloud database services, ideally within large or strategic accounts.</p>
<p>Familiarity with data platforms, lakehouse architectures, and cloud ecosystems (AWS, Azure, GCP), including how operational databases fit within broader data and AI strategies.</p>
<p>Understanding of reverse ETL, real-time decisioning, and operational analytics use cases, and how they drive value for customer-facing and internal applications.</p>
<p>Exposure to AI-native and agent-driven applications that depend on low-latency, highly scalable operational data services.</p>
<p>Prior experience in a high-growth, category-creating environment, helping shape new plays, messaging, and customer narratives.</p>
<p>Experience collaborating with partners and ISVs to drive joint pipeline and co-sell motions.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Postgres, operational databases, OLTP workloads, transactional cloud database services, data platforms, lakehouse architectures, cloud ecosystems, reverse ETL, real-time decisioning, operational analytics, AI-native applications, agent-driven applications, low-latency, highly scalable operational data services</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8449848002</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c6d97a39-7f0</externalid>
      <Title>Senior Engineering Manager, Activation</Title>
      <Description><![CDATA[<p>Job Title: Senior Engineering Manager, Activation</p>
<p>Join us at Brex, the intelligent finance platform that enables companies to spend smarter and move faster. We&#39;re looking for a seasoned leader to head our engineering group focused on building the systems and product experiences that power customer activation at Brex.</p>
<p>As a Senior Engineering Manager, you will be responsible for driving business and product strategies, collaborating with cross-functional partners, leveraging AI to reimagine and automate onboarding and implementation workflows, and leading and managing multiple teams of engineers.</p>
<p>Requirements:</p>
<ul>
<li>Bachelor&#39;s or Master&#39;s degree in Computer Science, Engineering, or a related field</li>
<li>Strong technical background and understanding of software development principles</li>
<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences</li>
<li>Demonstrated track record of shipping customer-facing features across multiple release cycles</li>
<li>3+ years of experience managing or leading multiple technical teams in a high-growth environment</li>
</ul>
<p>Bonus points:</p>
<ul>
<li>Experience with data platforms such as Snowflake, Hex, or similar</li>
<li>You have started your own technology venture or were an early technical founder/employee</li>
</ul>
<p>Compensation:</p>
<p>The expected salary range for this role is $300,000 - $375,000. However, the starting base pay will depend on a number of factors including the candidate&#39;s location, skills, experience, market demands, and internal pay parity.</p>
<p>If you&#39;re a champion for the customer and constantly put yourself in their shoes to create intuitive, frictionless experiences, we&#39;d love to hear from you!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$300,000 - $375,000</Salaryrange>
      <Skills>Leadership, Software Development, AI, Data Platforms, Full-Stack Engineering, Snowflake, Hex</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is a financial platform that provides corporate cards and banking services to companies in over 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8330487002</Applyto>
      <Location>San Francisco, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3a17bc01-d7d</externalid>
      <Title>Staff Software Engineer</Title>
      <Description><![CDATA[<p>dbt Labs is seeking a Staff Software Engineer to join our Engineering team. As a seasoned engineer, you will architect and build the durable memory substrate that powers agentic analytics workflows. This platform stores not just metadata, but meaning: decisions, intent, rationale, and history, and makes it safely accessible to humans, agents, and applications.</p>
<p>Your responsibilities will include:</p>
<ul>
<li>Prototyping apt technical solutions and finding best fits for the context engine.</li>
<li>Architecting and building the core Context Platform.</li>
<li>Designing schemas and primitives for Decision Memory and enterprise context.</li>
<li>Owning context storage systems (graph, vector, event/time-based).</li>
<li>Building read/write/query APIs used by agents, products, and external apps.</li>
<li>Designing permission-aware, auditable context access.</li>
</ul>
<p>You will be working closely with agentic systems engineers and product leadership to ensure the context engine is interoperable, portable, and zero-lock-in by design.</p>
<p>In this role, you will own:</p>
<ul>
<li>Context schemas and schema evolution strategies.</li>
<li>Storage and data modeling choices.</li>
<li>Platform APIs and interfaces.</li>
<li>Security, identity propagation, and audit foundations.</li>
<li>Long-term scalability and correctness of context data.</li>
</ul>
<p>You will not own:</p>
<ul>
<li>Agent behavior or orchestration logic.</li>
<li>Business rules or governance policy decisions.</li>
<li>Product UI or workflow automation.</li>
</ul>
<p>The ideal candidate will have significant experience building distributed systems, data platforms, or infrastructure, and will be comfortable operating in ambiguous, greenfield problem spaces. They will also have deep expertise in data modeling and schema design, experience designing shared platforms used by many teams, and strong instincts around APIs, contracts, and backward compatibility.</p>
<p>Nice to have: experience with knowledge graphs, metadata systems, or search/retrieval systems; experience building systems with governance, auditability, or compliance requirements; and familiarity with dbt, modern analytics stacks, or developer tooling.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Distributed systems, Data platforms, Infrastructure, Data modeling, Schema design, APIs, Contracts, Backward compatibility, Knowledge graphs, Metadata systems, Search/retrieval systems, dbt, Modern analytics stacks, Developer tooling</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>dbt Labs</Employername>
      <Employerlogo>https://logos.yubhub.co/getdbt.com.png</Employerlogo>
      <Employerdescription>dbt Labs is a leading analytics engineering platform, now used by over 90,000 teams every week, driving data transformations and AI use cases.</Employerdescription>
      <Employerwebsite>https://www.getdbt.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/dbtlabsinc/jobs/4661362005</Applyto>
      <Location>India - Remote</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>58a44dab-91a</externalid>
      <Title>Partner Solutions Architect - Japan</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across Japan. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>
<p>You will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud.</p>
<p>Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing. This is not a purely reactive enablement role. The Partner SA is expected to help shape and execute repeatable partner plays that create revenue.</p>
<p>That includes enabling partner sellers and architects, supporting account mapping and seller-to-seller engagement, helping define joint value propositions, supporting partner-led pipeline generation, and influencing product and field strategy based on what is learned in-market.</p>
<p>Internal operating docs show this motion consistently includes enablement sessions, QBR sponsorships, account planning, workshops, field events, and targeted campaigns designed to produce sourced and influenced pipeline.</p>
<p>You&#39;ll be part of a team helping dbt scale its ecosystem through better partner capability, tighter field alignment, and more repeatable pipeline generation. The role is especially important as dbt continues investing in structured partner motions and deeper engagement with major cloud and data platform partners.</p>
<p>What you&#39;ll do:</p>
<ul>
<li>Partner closely with Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners</li>
<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>
<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>
<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>
<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>
<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>
<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>
<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>
<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>
<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>
<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>
<li>Travel approximately 30-40% to support partner planning, enablement, executive meetings, and field events across Japan</li>
</ul>
<p>This scope reflects how the Partner SA team is already operating: enabling partner field teams, building account-level alignment, supporting QBRs and regional events, and translating those activities into sourced and engaged pipeline.</p>
<p>What you&#39;ll need:</p>
<ul>
<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>
<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>
<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>
<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>
<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>
<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>
<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>
<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>
<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>
<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>
</ul>
<p>What will make you stand out:</p>
<ul>
<li>Experience working directly in partner, alliance, or ecosystem roles</li>
<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>
<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>
<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>
<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>
<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>
<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>
<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>
</ul>
<p>What to expect in the interview process (all video interviews unless accommodations are needed):</p>
<ul>
<li>Interview with Talent Acquisition Partner</li>
<li>Interview with Hiring Manager</li>
<li>Team Interviews</li>
<li>Demo Round</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner engineering, customer-facing technical role, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>dbt Labs</Employername>
      <Employerlogo>https://logos.yubhub.co/getdbt.com.png</Employerlogo>
      <Employerdescription>dbt Labs is a pioneer of analytics engineering, helping data teams transform raw data into reliable, actionable insights. It has grown from an open source project into the leading analytics engineering platform, now used by over 90,000 teams every week.</Employerdescription>
      <Employerwebsite>https://www.getdbt.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673657005</Applyto>
      <Location>Japan - Remote</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c2aaf7ac-804</externalid>
      <Title>Security Engineer - Threat Detection</Title>
      <Description><![CDATA[<p><strong>Job Description</strong></p>
<p>You will design, build, and maintain detections that identify malicious activity across Stripe&#39;s infrastructure, applications, and cloud environments.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design, build, and tune high-fidelity detections across modern SIEM platforms, covering adversary TTPs across the full attack lifecycle</li>
<li>Develop detection hypotheses by researching TTPs, identifying evidence sources, and determining detection opportunities across available telemetry</li>
<li>Conduct hypothesis-driven threat hunts to identify malicious activity, uncover detection gaps, and validate security controls</li>
<li>Perform malware analysis and reverse engineering to extract indicators and inform detection strategies</li>
<li>Build network-based detections (flow, pcap, protocol analysis) and endpoint-based detections (event logs, EDR telemetry, memory/file artifacts) across Windows, Linux, and macOS</li>
<li>Partner with Threat Intelligence to operationalize intel reports into detections, hunting leads, and enrichment logic</li>
<li>Collaborate with IR, SOC, and offensive security teams to validate and refine detections based on real-world incidents and red team exercises</li>
<li>Build data pipelines, automation, and tooling that enable detection-as-code practices and scalable deployment</li>
<li>Map detection coverage to MITRE ATT&amp;CK, identifying and prioritizing gaps across key attack surfaces</li>
<li>Lead projects, mentor teammates, and champion quality standards within the team</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5+ years of experience in detection engineering, threat hunting, or security operations</li>
<li>Demonstrated experience writing detection logic in modern SIEM platforms (e.g., Splunk, Chronicle, Elastic, CrowdStrike NG-SIEM, Panther, Microsoft Sentinel)</li>
<li>Strong understanding of adversary tradecraft across the attack lifecycle: initial access, privilege escalation, lateral movement, defense evasion, persistence, and exfiltration</li>
<li>Ability to extract TTPs from threat intelligence reports and translate them into detection opportunities</li>
<li>Experience developing network-based and endpoint-based detections across multiple OS platforms (Windows, Linux, macOS)</li>
<li>Experience analyzing telemetry across endpoint, network, cloud (AWS/GCP/Azure), identity, and application log sources</li>
<li>Proficiency in detection/query languages (SPL, KQL, EQL, YARA-L, SQL) and programming (Python or similar)</li>
<li>Strong communication skills with the ability to document detection logic and explain findings to technical and non-technical audiences</li>
<li>Adversarial mindset, understanding how attackers operate to build detections that catch real-world threats</li>
</ul>
<p><strong>Preferred Qualifications</strong></p>
<ul>
<li>Experience in detection engineering or threat hunting within fintech, financial services, or highly regulated environments</li>
<li>Background in malware analysis, reverse engineering, or threat research</li>
<li>Experience with purple team operations, collaborating with offensive security to validate detections</li>
<li>Familiarity with big data platforms (Databricks, Trino, PySpark) for large-scale log analysis</li>
<li>Proficiency with AI/LLM-assisted development tools (Claude Code, Cursor, GitHub Copilot) applied to detection workflows</li>
<li>Interest in agentic automation, using LLMs to augment hunting, tuning, or triage</li>
<li>Experience with detection validation tools (Atomic Red Team, ATT&amp;CK Evaluations)</li>
<li>Contributions to open-source detection content, research, or conference presentations</li>
<li>Relevant certifications such as HTB CDSA, GCIH, GCFA, GNFA, OSCP, TCM PMAT, or GREM</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>detection engineering, threat hunting, security operations, SIEM platforms, adversary tradecraft, network-based detections, endpoint-based detections, telemetry analysis, detection/query languages, programming, communication skills, fintech, financial services, malware analysis, reverse engineering, purple team operations, big data platforms, AI/LLM-assisted development tools, agentic automation, detection validation tools, open-source detection content, relevant certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Stripe</Employername>
      <Employerlogo>https://logos.yubhub.co/stripe.com.png</Employerlogo>
      <Employerdescription>Stripe is a financial infrastructure platform for businesses, used by millions of companies worldwide.</Employerdescription>
      <Employerwebsite>https://stripe.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/stripe/jobs/7827230</Applyto>
      <Location>Ireland</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>4ea7999b-3d8</externalid>
      <Title>Resident Solutions Architect - Healthcare &amp; Life Sciences</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to customers&#39; successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494145002</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>d6421dea-6e3</externalid>
      <Title>Strategic Hunter Account Executive - Lakebase</Title>
      <Description><![CDATA[<p>We are seeking a Strategic Hunter Account Executive to help customers modernize their operational data foundation with Databricks Lakebase, our fully-managed Postgres offering for intelligent applications.</p>
<p>This high-impact role sits within the Lakebase Go-To-Market team and partners closely with regional Account Executives to drive adoption of Lakebase with platform, application, and data teams.</p>
<p>Lakebase gives customers a unified, governed foundation for operational workloads and AI-native applications, helping them move away from a fragmented estate of point databases toward a modern, scalable, serverless Postgres service.</p>
<p>If you want to be at the forefront of operational databases for AI and intelligent applications at one of the fastest-growing data and AI companies in the world, this is your opportunity.</p>
<p><strong>The impact you will have</strong></p>
<ul>
<li>Drive new Lakebase revenue by identifying, qualifying, and closing Lakebase opportunities within a defined territory, in partnership with regional Account Executives and the broader account team.</li>
<li>Lead with outcomes for key Lakebase personas, including platform teams and developers, data teams, and central IT, articulating how Lakebase helps them ship features faster, simplify operational data architectures, and improve governance and cost efficiency.</li>
<li>Sell the value of fully-managed Postgres for intelligent applications, positioning Lakebase as the optimal choice for operational workloads that power real-time, AI-driven experiences.</li>
<li>Run complex, multi-threaded sales cycles from discovery and value hypothesis through commercial negotiation and close, navigating executive, technical, and line-of-business stakeholders.</li>
<li>Orchestrate proof-of-value engagements and POCs that validate Lakebase’s benefits for OLTP-style workloads, reverse ETL, and AI/ML-driven applications, in partnership with solution architects and specialists.</li>
<li>Compete and win against legacy and cloud-native operational databases by leveraging our compete assets, benchmarks, and customer references.</li>
<li>Align to measurable business outcomes such as performance, developer productivity, time-to-market for new features, cost reduction, and simplification of the operational data landscape.</li>
<li>Partner cross-functionally with Product Management, Marketing, Customer Success, and Partner teams to shape territory plans, launch plays, and co-selling motions with key ISVs and GSIs.</li>
<li>Enable the field by sharing Lakebase best practices, success stories, and sales motions with broader sales teams, helping scale Lakebase proficiency across the organization.</li>
</ul>
<p><strong>What success looks like in this role</strong></p>
<p>This role requires the ability to operate across two key motions simultaneously:</p>
<ul>
<li>Establish top strategic focus accounts by engaging application development teams to create net-new intelligent applications leveraging Lakebase.</li>
<li>Drive longer-term Postgres standardization and migration within Databricks&#39; most strategic accounts.</li>
</ul>
<p>Candidates should demonstrate how they can act as a force multiplier across multiple dimensions of the business.</p>
<p>Success in this role requires strength in four areas:</p>
<ul>
<li>Business ownership – Operate at a business-unit level by tracking revenue, pipeline, and key observations, and by identifying areas needing additional focus or support.</li>
<li>Strategic account engagement – Partner with account teams to engage priority accounts across the global DB700, driving strategic opportunities from initial engagement through successful outcomes.</li>
<li>Field enablement – Build and execute enablement plans that empower AEs and SAs to confidently carry the Lakebase conversation even when the specialist is not present.</li>
<li>Market voice and thought leadership – Develop an internal and external presence by contributing to global AMAs and internal forums, and by representing Databricks at key first- and third-party events.</li>
</ul>
<p><strong>What we look for</strong></p>
<ul>
<li>7+ years of enterprise SaaS sales experience, consistently exceeding quota in complex, multi-stakeholder deals.</li>
<li>Proven success selling data platforms, operational databases (e.g., Postgres, MySQL, cloud-native DBaaS), or adjacent data/AI infrastructure to technical buyers and business leaders.</li>
<li>Strong understanding of modern data and application architectures, including cloud-native services, microservices, event-driven systems, and how operational data underpins AI and analytics strategies.</li>
<li>Ability to sell to both technical stakeholders (developers, architects, data engineers) and business stakeholders (product leaders, operations, line-of-business owners).</li>
<li>Demonstrated experience leading specialist or overlay motions, working jointly with core Account Executives to create and progress opportunities.</li>
<li>Executive presence with the ability to whiteboard architectures, lead C-level conversations, and build trust with senior decision makers.</li>
<li>Strong value selling skills: adept at discovering pain, building a business case, and tying technical capabilities to clear, quantified outcomes.</li>
<li>Excellent communication, storytelling, and negotiation skills, with comfort presenting to both large and small audiences.</li>
<li>Bachelor’s degree or equivalent practical experience.</li>
</ul>
<p><strong>Preferred qualifications</strong></p>
<ul>
<li>Experience selling Postgres, operational databases, OLTP workloads, or transactional cloud database services, ideally within large or strategic accounts.</li>
<li>Familiarity with data platforms, lakehouse architectures, and cloud ecosystems (AWS, Azure, GCP), including how operational databases fit within broader data and AI strategies.</li>
<li>Understanding of reverse ETL, real-time decisioning, and operational analytics use cases, and how they drive value for customer-facing and internal applications.</li>
<li>Exposure to AI-native and agent-driven applications that depend on low-latency, highly scalable operational data services.</li>
<li>Prior experience in a high-growth, category-creating environment, helping shape new plays, messaging, and customer narratives.</li>
<li>Experience collaborating with partners and ISVs to drive joint pipeline and co-sell motions.</li>
</ul>
<p><strong>Benefits</strong></p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please click here.</p>
<p><strong>Our Commitment to Diversity and Inclusion</strong></p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data platforms, operational databases, Postgres, MySQL, cloud-native DBaaS, data/AI infrastructure, technical buyers, business leaders, modern data and application architectures, cloud-native services, microservices, event-driven systems, AI and analytics strategies, technical stakeholders, business stakeholders, value selling skills, discovering pain, building a business case, quantified outcomes, communication, storytelling, negotiation skills, OLTP workloads, transactional cloud database services, lakehouse architectures, cloud ecosystems, reverse ETL, real-time decisioning, operational analytics use cases, AI-native applications, agent-driven applications, high-growth environments, category-creating environments, partner collaborations, ISV collaborations</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8477547002</Applyto>
      <Location>Bengaluru, India; Mumbai, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>47807ca3-e36</externalid>
      <Title>Strategic AI/BI Account Executive</Title>
      <Description><![CDATA[<p>We are seeking a Strategic AI/BI Account Executive to help enterprise customers transform how business users interact with data. This high-impact role sits within the AI Go-To-Market team and partners closely with Enterprise Account Executives to drive adoption of Databricks AI/BI and Genie in APJ.</p>
<p>You will help organisations move beyond static dashboards to governed, conversational, AI-powered analytics at the centre of the convergence of business intelligence, data platforms, and generative AI. Enterprise analytics is rapidly evolving from dashboards and static reporting to conversational, AI-driven decision platforms. Databricks AI/BI and Genie empower business users to securely interact with governed data using natural language, transforming the data platform into a true decision platform.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Partner with Enterprise AEs to identify, qualify, and close AI/BI opportunities</li>
<li>Engage C-level, analytics, and line-of-business leaders to modernise analytics strategies</li>
<li>Displace or expand legacy BI platforms with AI-powered, governed analytics solutions</li>
<li>Lead conversations around semantic governance, self-service analytics, and natural language data access</li>
<li>Drive proof-of-value engagements and scale enterprise-wide adoption</li>
<li>Align AI/BI initiatives to measurable business outcomes (productivity, speed to insight, revenue impact)</li>
<li>Enable field teams and serve as a subject matter expert on modern analytics architectures</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>Enterprise sales experience in BI, analytics, data platforms, or AI/ML</li>
<li>Strong understanding of modern analytics architectures and data governance</li>
<li>Ability to sell to both technical and business stakeholders</li>
<li>Executive presence and experience navigating complex buying cycles</li>
<li>Passion for AI and the impact of GenAI on enterprise analytics</li>
<li>Experience operating in a specialist or overlay sales model</li>
<li>Ability to translate technical capabilities into clear business value</li>
<li>7+ years of Enterprise Sales experience, exceeding quotas in larger accounts</li>
</ul>
<p>Preferred qualifications include:</p>
<ul>
<li>Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot</li>
<li>Familiarity with semantic layers, metrics stores, or governed data models</li>
<li>Understanding of lakehouse architectures and cloud data platforms</li>
<li>Exposure to GenAI, natural language interfaces, or conversational applications</li>
<li>Consulting or solution design experience in customer-facing roles</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise sales experience in BI, analytics, data platforms, or AI/ML, Strong understanding of modern analytics architectures and data governance, Ability to sell to both technical and business stakeholders, Executive presence and experience navigating complex buying cycles, Passion for AI and the impact of GenAI on enterprise analytics, Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot, Familiarity with semantic layers, metrics stores, or governed data models, Understanding of lakehouse architectures and cloud data platforms, Exposure to GenAI, natural language interfaces, or conversational applications, Consulting or solution design experience in customer-facing roles</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company with over 10,000 organisations worldwide relying on its data intelligence platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8441884002</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>f560b1d5-028</externalid>
      <Title>Senior Digital Programs Manager</Title>
      <Description><![CDATA[<p>We are seeking an innovative and operationally minded Digital Programs Manager to design, build, and maintain the programs, operations, and automation infrastructure that optimize the customer experience at scale and drive operational efficiency across all segments.</p>
<p>This critical role is focused on maximizing retention by delivering a seamless, valuable, and consistent service through a hybrid digital and human approach, directly improving product adoption and customer engagement.</p>
<p>You will establish a digital-first baseline of automated touchpoints for all scaled (downmarket) customers, complete with clear, data-driven escalation paths to human support for complex issues.</p>
<p>Simultaneously, you will deliver workflows and automation that enable our Customer Success Architects to work faster and smarter.</p>
<p>The ideal candidate thrives at the intersection of process, technology, and customer experience, and will be responsible for creating the playbooks and automations required to service a large volume of customers effectively and efficiently.</p>
<p>Responsibilities:</p>
<ul>
<li>Define and execute the comprehensive digital Customer Experience (CX) strategy to align with overall business objectives, maximize customer value, and proactively address the needs of the scaled segment.</li>
<li>Architect and deploy efficient and effective digital workflows for core customer journeys, including standardized customer onboarding and continuous lifecycle engagement programs.</li>
<li>Manage end-to-end digital programs (e.g., onboarding, adoption campaigns, renewal notifications) tailored for the scaled customer segment.</li>
<li>Design and execute campaigns using digital channels (email, Slack, webinars, etc.) to drive feature adoption and sustained product engagement.</li>
<li>Continuously test, measure, and iterate on program performance to improve conversion rates, customer satisfaction scores (CSAT), and other key performance indicators (KPIs).</li>
<li>Design the escalation logic and scoring models that trigger human intervention from automated sequences.</li>
<li>Support the growth of our business by automating workflow elements for our higher-touch Enterprise team, such as programmatically identifying and flagging customer risk and surfacing high-value upsell opportunities.</li>
<li>Create and document clear, repeatable operations playbooks and Standard Operating Procedures (SOPs) for key digital customer journeys.</li>
<li>Serve as the primary liaison, working with Customer Success, Product, Sales, and Marketing teams to ensure alignment, gather requirements, and guarantee the effective execution of all digital CX initiatives.</li>
<li>Collaborate with our data engineering and ops teams to ensure data cleanliness and segmentation accuracy within our customer systems to enable highly targeted and personalized digital outreach.</li>
<li>Own, track, and analyze key program metrics and operational KPIs (e.g., digital engagement rates, adoption rates, churn reduction, customer health scores).</li>
<li>Provide regular, insightful reporting to leadership and relevant stakeholders on the overall effectiveness, performance, and impact of digital programs within the scaled customer segment.</li>
<li>Stay current with industry trends, emerging technologies, and best practices in digital CX, and iterate on programs based on direct customer feedback and data-driven insights.</li>
</ul>
<p>We&#39;re Looking For Someone Who Has:</p>
<ul>
<li>5+ years of experience in Program Management, Customer Success Operations, Digital Success, or a related role, preferably supporting a high-volume, scaled customer segment with hybrid digital/human experience and/or pooled coverage (B2B SaaS experience is a plus).</li>
<li>Demonstrated experience in building, launching, and scaling digital programs designed to influence customer behavior (adoption, engagement, retention), with proven impact on activation and value adoption (beyond open rates/clicks).</li>
<li>Strong operational skills, with expertise in process mapping, creating playbooks, and defining automation requirements.</li>
<li>Proficiency with CRM systems, Marketing Automation platforms, and CS software.</li>
<li>Excellent analytical skills and a data-driven approach, comfortable using data to tell a story and make recommendations.</li>
<li>Excellent written and verbal communication skills.</li>
<li>Strong process and project delivery discipline.</li>
<li>Eagerness to learn new technologies and adapt to evolving customer needs.</li>
</ul>
<p>We&#39;d Be Extra Excited For Someone Who Has:</p>
<ul>
<li>Familiarity with Mixpanel, or a similar analytics tool, including familiarity with analytics implementation methods like SDKs, Customer Data Platforms (CDPs), and Event Streaming.</li>
<li>Ability to build, script, or configure custom solutions to drive process automation or custom workflow creation.</li>
<li>Experience writing SQL queries to pull, validate, and analyze customer data directly from a database.</li>
<li>Experience architecting systems and data flows.</li>
<li>Familiarity with analytics best practices across business segments and verticals.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$163,500-$199,500 USD</Salaryrange>
      <Skills>Digital Programs Management, Customer Success Operations, Digital Success, Program Management, Customer Experience, Process Mapping, Automation Requirements, CRM Systems, Marketing Automation Platforms, CS Software, Data-Driven Approach, Analytics, SQL Queries, Database Analysis, System Architecture, Data Flows, Mixpanel, Analytics Tool, SDKs, Customer Data Platforms, Event Streaming, Process Automation, Custom Workflow Creation</Skills>
      <Category>Operations</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a digital analytics platform that helps teams accelerate adoption, improve retention, and ship with confidence. It has over 29,000 customers worldwide.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7568212</Applyto>
      <Location>New York City, US (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>015afe59-9fd</externalid>
      <Title>Data Analyst II</Title>
      <Description><![CDATA[<p>Why join us</p>
<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>
<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>
<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>
<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>
<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>
<p>Data at Brex</p>
<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>
<p>Our Data Scientists, Analysts, and Engineers work together to make data, and the insights derived from it, a core asset across the company.</p>
<p>What you’ll do</p>
<p>As a Data Analyst II (DA), you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>
<p>You will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses.</p>
<p>This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>
<p>Where you’ll work</p>
<p>This role will be based in our New York office.</p>
<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home.</p>
<p>We currently require a minimum of three coordinated days in the office per week: Monday, Wednesday, and Thursday.</p>
<p>As a perk, we also have up to four weeks per year of fully remote work!</p>
<p>Responsibilities</p>
<p>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</p>
<p>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</p>
<p>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</p>
<p>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</p>
<p>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</p>
<p>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</p>
<p>Contribute to the automation of recurring analyses and reporting workflows using Python.</p>
<p>Requirements</p>
<p>3+ years of experience in data analytics or a related role in a professional setting.</p>
<p>2+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</p>
<p>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</p>
<p>Experience with Python for data analysis, automation, or scripting.</p>
<p>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</p>
<p>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</p>
<p>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</p>
<p>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automated reporting, and build self-service data tools.</p>
<p>Bonus points</p>
<p>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</p>
<p>Familiarity with dbt for data modeling and transformation.</p>
<p>Exposure to data pipeline orchestration tools (e.g., Airflow).</p>
<p>Experience in fintech, financial services, or payments.</p>
<p>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</p>
<p>Compensation</p>
<p>The expected salary range for this role is $93,600 - $117,000.</p>
<p>However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity.</p>
<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$93,600 - $117,000</Salaryrange>
      <Skills>SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, Financial services, Payments</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is an intelligent finance platform that enables companies to spend smarter and move faster in over 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8463702002</Applyto>
      <Location>New York, New York, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>85f1f87e-70f</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
</ul>
<ul>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
</ul>
<ul>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
</ul>
<ul>
<li>Consult on architecture and design; bootstrap hands-on projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
</ul>
<ul>
<li>Provide an escalated level of support for customer operational issues.</li>
</ul>
<ul>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
</ul>
<ul>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years experience in data engineering, data platforms &amp; analytics</li>
</ul>
<ul>
<li>Comfortable writing code in either Python or Scala</li>
</ul>
<ul>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
</ul>
<ul>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
</ul>
<ul>
<li>Familiarity with CI/CD for production deployments</li>
</ul>
<ul>
<li>Working knowledge of MLOps</li>
</ul>
<ul>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
</ul>
<ul>
<li>Experience with technical project delivery - managing scope and timelines.</li>
</ul>
<ul>
<li>Documentation and white-boarding skills.</li>
</ul>
<ul>
<li>Experience working with clients and managing conflicts.</li>
</ul>
<ul>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
</ul>
<ul>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461327002</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>ffd169d9-40b</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
</ul>
<ul>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
</ul>
<ul>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
</ul>
<ul>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
</ul>
<ul>
<li>Provide an escalated level of support for customer operational issues.</li>
</ul>
<ul>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
</ul>
<ul>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
</ul>
<ul>
<li>Comfortable writing code in either Python or Scala</li>
</ul>
<ul>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
</ul>
<ul>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
</ul>
<ul>
<li>Familiarity with CI/CD for production deployments</li>
</ul>
<ul>
<li>Working knowledge of MLOps</li>
</ul>
<ul>
<li>Design and deployment of performant end-to-end data architectures</li>
</ul>
<ul>
<li>Experience with technical project delivery - managing scope and timelines.</li>
</ul>
<ul>
<li>Documentation and white-boarding skills.</li>
</ul>
<ul>
<li>Experience working with clients and managing conflicts.</li>
</ul>
<ul>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
</ul>
<ul>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, data platforms &amp; analytics, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified data intelligence platform to over 10,000 organizations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461239002</Applyto>
      <Location>Atlanta, Georgia</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>760c3e88-e35</externalid>
      <Title>Senior Product Manager, Data</Title>
      <Description><![CDATA[<p>Job Title: Senior Product Manager, Data</p>
<p>We are seeking a Senior Product Manager to support the development of CoreWeave&#39;s Enterprise Data Platform within the CIO organization. This role will contribute to building a scalable, high-performance data lake and data architecture, integrating data from key sources across Operations, Engineering, Sales, Finance, and other IT partners.</p>
<p>As a Senior Product Manager for Data Infrastructure and Analytics, you will help drive data ingestion, transformation, governance, and analytics enablement. You will collaborate with engineering, analytics, finance, and business teams to help deliver data lake and pipeline orchestration solutions, ensuring accessible data for business insights.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Own and evangelize Data Platform and Business Analytics roadmap and strategy across CoreWeave</li>
<li>Assist with the execution of CoreWeave&#39;s enterprise data architecture, helping enable the data lake and domain-driven data layer</li>
<li>Support the development and enhancement of data ingestion, transformation, and orchestration pipelines for scalability, efficiency, and reliability</li>
<li>Work with the Engineering and Data teams to maintain and enhance data pipelines for both structured and unstructured data, enabling efficient data movement across the organization</li>
<li>Collaborate with Finance, GTM, Infrastructure, Data Center, and Supply Chain teams to help unify and model data from core systems (ERP, CRM, Asset Mgmt, Supply Chain systems, etc.)</li>
<li>Contribute to data governance and quality initiatives, focusing on data consistency, lineage tracking, and compliance with security standards</li>
<li>Support the BI and analytics layer by partnering with stakeholders to enable data products, dashboards, and reporting capabilities</li>
<li>Help prioritize data-driven initiatives, ensuring alignment with business goals and operational needs in coordination with leadership</li>
</ul>
<p>Requirements:</p>
<ul>
<li>5+ years of experience in data product management, data architecture, or enterprise data engineering roles</li>
<li>Familiarity with data lakes, data warehouses, ETL/ELT and streaming pipelines, and data governance frameworks</li>
<li>Hands-on experience with modern data stack technologies (such as Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka)</li>
<li>Understanding of data modeling, domain-driven design, and creating scalable data platforms</li>
<li>Experience supporting the end-to-end data product lifecycle, including requirements gathering and implementation</li>
<li>Strong collaboration skills with engineering, analytics, and business teams to help deliver data initiatives</li>
<li>Awareness of data security, compliance, and governance best practices</li>
<li>Understanding of BI and analytics platforms (such as Tableau, Looker, Power BI) and supporting self-service analytics</li>
</ul>
<p>Why CoreWeave?</p>
<p>At CoreWeave, we work hard, have fun, and move fast! We&#39;re in an exciting stage of hyper-growth that you will not want to miss out on. We&#39;re not afraid of a little chaos, and we&#39;re constantly learning. Our team cares deeply about how we build our product and how we work together, which is represented through our core values:</p>
<ul>
<li>Be Curious at Your Core</li>
<li>Act Like an Owner</li>
<li>Empower Employees</li>
<li>Deliver Best-in-Class Client Experiences</li>
<li>Achieve More Together</li>
</ul>
<p>We support and encourage an entrepreneurial outlook and independent thinking. We foster an environment that encourages collaboration and provides the opportunity to develop innovative solutions to complex problems. As we get set for take off, the growth opportunities within the organization are constantly expanding. You will be surrounded by some of the best talent in the industry, who will want to learn from you, too. Come join us!</p>
<p>Salary Range: $143,000 to $210,000</p>
<p>Benefits:</p>
<ul>
<li>Medical, dental, and vision insurance - 100% paid for by CoreWeave</li>
<li>Company-paid Life Insurance</li>
<li>Voluntary supplemental life insurance</li>
<li>Short and long-term disability insurance</li>
<li>Flexible Spending Account</li>
<li>Health Savings Account</li>
<li>Tuition Reimbursement</li>
<li>Ability to Participate in Employee Stock Purchase Program (ESPP)</li>
<li>Mental Wellness Benefits through Spring Health</li>
<li>Family-Forming support provided by Carrot</li>
<li>Paid Parental Leave</li>
<li>Flexible, full-service childcare support with Kinside</li>
<li>401(k) with a generous employer match</li>
<li>Flexible PTO</li>
<li>Catered lunch each day in our office and data center locations</li>
<li>A casual work environment</li>
<li>A work culture focused on innovative disruption</li>
</ul>
<p>Workplace:</p>
<p>While we prioritize a hybrid work environment, remote work may be considered for candidates located more than 30 miles from an office, based on role requirements for specialized skill sets. New hires will be invited to attend onboarding at one of our hubs within their first month. Teams also gather quarterly to support collaboration.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$143,000 to $210,000</Salaryrange>
      <Skills>data product management, data architecture, enterprise data engineering, data lakes, data warehouses, ETL/ELT and streaming pipelines, data governance frameworks, modern data stack technologies, Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka, data modeling, domain-driven design, scalable data platforms, BI and analytics platforms, Tableau, Looker, Power BI</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>CoreWeave</Employername>
      <Employerlogo>https://logos.yubhub.co/coreweave.com.png</Employerlogo>
      <Employerdescription>CoreWeave is a cloud-based platform that enables innovators to build and scale AI with confidence.</Employerdescription>
      <Employerwebsite>https://www.coreweave.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/coreweave/jobs/4649824006</Applyto>
      <Location>Livingston, NJ / New York, NY / Sunnyvale, CA / Bellevue, WA/San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3168d7d3-70b</externalid>
      <Title>Partner Solutions Architect - North America</Title>
      <Description><![CDATA[<p>About Us</p>
<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across North America. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>
<p>As a Partner Solutions Architect, you will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud. Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing.</p>
<p>Responsibilities</p>
<ul>
<li>Partner closely with North America Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>
<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>
<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>
<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>
<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>
<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>
<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>
<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>
<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>
<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>
<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>
</ul>
<p>Requirements</p>
<ul>
<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>
<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>
<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>
<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>
<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>
<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>
<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>
<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>
<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>
<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>
</ul>
<p>What will make you stand out</p>
<ul>
<li>Experience working directly in partner, alliance, or ecosystem roles</li>
<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>
<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>
<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>
<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>
<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>
<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>
<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>
</ul>
<p>Benefits</p>
<ul>
<li>Unlimited vacation (and yes we use it!)</li>
<li>Pension coverage</li>
<li>Excellent healthcare</li>
<li>Paid Parental Leave</li>
<li>Wellness stipend</li>
<li>Home office stipend, and more!</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner development, field engineering, sales engineering, consulting, partner engineering, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>dbt Labs</Employername>
      <Employerlogo>https://logos.yubhub.co/getdbt.com.png</Employerlogo>
      <Employerdescription>dbt Labs is a software company that provides an analytics engineering platform used by over 90,000 teams every week, driving data transformations and AI use cases. As of February 2025, they have surpassed $100 million in annual recurring revenue (ARR) and serve more than 5,400 dbt Platform customers.</Employerdescription>
      <Employerwebsite>https://www.getdbt.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673630005</Applyto>
      <Location>Canada - Remote; US - Remote</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>26f523c0-bbd</externalid>
      <Title>Resident Solutions Architect - Manufacturing</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect (RSA) on our Professional Services team, you will work with customers on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>The impact you will have:</p>
<ul>
<li>Handle a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
</ul>
<ul>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
</ul>
<ul>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
</ul>
<ul>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
</ul>
<ul>
<li>Provide an escalated level of support for customer operational issues</li>
</ul>
<ul>
<li>Collaborate with the Databricks Technical, Project Manager, Architect and Customer teams to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
</ul>
<ul>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
</ul>
<ul>
<li>Comfortable writing code in either Python or Scala</li>
</ul>
<ul>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
</ul>
<ul>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
</ul>
<ul>
<li>Familiarity with CI/CD for production deployments</li>
</ul>
<ul>
<li>Working knowledge of MLOps</li>
</ul>
<ul>
<li>Design and deployment of performant end-to-end data architectures</li>
</ul>
<ul>
<li>Experience with technical project delivery - managing scope and timelines</li>
</ul>
<ul>
<li>Documentation and white-boarding skills</li>
</ul>
<ul>
<li>Experience working with clients and managing conflicts</li>
</ul>
<ul>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
</ul>
<ul>
<li>Ability to travel up to 30% when needed</li>
</ul>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location is in visit our page here.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494154002</Applyto>
      <Location>Boston, Massachusetts</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>2c0fc802-cf3</externalid>
      <Title>Staff Product Manager, AI Platform</Title>
      <Description><![CDATA[<p>At Databricks, we are building the world&#39;s best data and AI infrastructure platform. As a Staff Product Manager on the AI Platform team, you will drive the vision and roadmap for AI platform product areas and define how customers build, train, deploy, and monitor AI and ML systems on Databricks.</p>
<p>You will own the product roadmap for AI platform areas, defining what we build, why, and in what order, to accelerate customer adoption of AI and ML in production. You will drive strategy for key AI platform capabilities, shaping how enterprises operationalize AI at scale.</p>
<p>You will partner closely with engineering teams to make deeply technical decisions about ML infrastructure, from distributed training architectures to real-time serving systems. You will represent the voice of the customer by engaging directly with enterprise ML teams, translating their pain points and workflows into platform capabilities that simplify the path to production AI.</p>
<p>You will collaborate with GTM, Solutions Architecture, and Customer Success teams to drive enterprise adoption, shape field enablement, and inform competitive positioning. You will define pricing, packaging, and commercialization strategy for AI platform features, working with business teams to maximize value capture.</p>
<p>You will grow end-user engagement with Databricks AI tools by identifying adoption bottlenecks and partnering cross-functionally to remove them.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$181,700-$249,800 USD</Salaryrange>
      <Skills>Deep technical background in computer science, electrical engineering, or equivalent degree, Experience with ML/AI infrastructure, data platforms, or cloud services, Proven enterprise B2B product management experience with highly technical customers, Ability to engage credibly with world-class ML engineers, Familiarity with recommendation systems</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data and AI workloads. It was founded by the original creators of Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8420609002</Applyto>
      <Location>San Francisco, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0ff568ca-d59</externalid>
      <Title>Senior Software Engineer - Data Infrastructure Services</Title>
      <Description><![CDATA[<p>CoreWeave is seeking a senior software engineer to join its Data Platforms Team. The ideal candidate will have experience in database and stream processing, and will be responsible for designing and implementing the platform to deliver data to teams with a focus on providing managed solutions through APIs.</p>
<p>The successful candidate will participate in operations and scaling of relational data platforms, develop a stream processing architecture, and improve the performance, security, reliability, and scalability of our data platforms and related services. They will also establish guidelines, guardrails for data access and storage for stakeholder teams, and ensure compliance with standards for data protection regulation.</p>
<p>In addition to technical skills, the ideal candidate will be able to grow, change, invest in their teammates, be invested-in, share their ideas, listen to others, be curious, have fun, and be themselves. CoreWeave values diversity and inclusion, and encourages candidates from all backgrounds to apply.</p>
<p>Key responsibilities:</p>
<ul>
<li>Design and implement the platform to deliver data to teams with a focus on providing managed solutions through APIs</li>
</ul>
<ul>
<li>Participate in operations and scaling of relational data platforms</li>
</ul>
<ul>
<li>Develop a stream processing architecture</li>
</ul>
<ul>
<li>Improve the performance, security, reliability, and scalability of our data platforms and related services</li>
</ul>
<ul>
<li>Establish guidelines, guardrails for data access and storage for stakeholder teams</li>
</ul>
<ul>
<li>Ensure compliance with standards for data protection regulation</li>
</ul>
<ul>
<li>Grow, change, invest in your teammates, be invested-in, share your ideas, listen to others, be curious, have fun, and be yourself</li>
</ul>
<p>Requirements:</p>
<ul>
<li>5+ years of experience in a software or infrastructure engineering industry</li>
</ul>
<ul>
<li>Experience operating services in production and at scale</li>
</ul>
<ul>
<li>Familiarity with one of the distributed NewSQL datastores such as CockroachDB, TiDB, YDB, Yugabyte and/or stream processing tools such as NATS or Kafka</li>
</ul>
<ul>
<li>Experience with designing and operating these systems at scale</li>
</ul>
<ul>
<li>Familiarity with Kubernetes and an interest in, or comfort with, using it for event-driven and/or stateful orchestration</li>
</ul>
<ul>
<li>Proficiency in Go/Python/Java and an interest in contributing to open source</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$165,000 to $242,000</Salaryrange>
      <Skills>database and stream processing, API design and implementation, operational and scaling of relational data platforms, stream processing architecture, performance, security, reliability, and scalability of data platforms, data access and storage guidelines, data protection regulation compliance, Kubernetes, Go/Python/Java, open source contribution</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>CoreWeave</Employername>
      <Employerlogo>https://logos.yubhub.co/coreweave.com.png</Employerlogo>
      <Employerdescription>CoreWeave is a cloud computing company that provides a platform for building and scaling AI applications.</Employerdescription>
      <Employerwebsite>https://www.coreweave.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/coreweave/jobs/4671479006</Applyto>
      <Location>Sunnyvale, CA / Bellevue, WA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3d57b93e-423</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have:</p>
<ul>
<li>Databricks Certification</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, data architecture, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456948002</Applyto>
      <Location>Atlanta, Georgia</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c7ba4251-36b</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>Job Title: Resident Solutions Architect - Public Sector</p>
<p>We are seeking a highly skilled Resident Solutions Architect to join our Professional Services team in Washington, D.C. As a Resident Solutions Architect, you will work with customers on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>
<p>Responsibilities:</p>
<ul>
<li>Handle a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Collaborate with the Databricks Technical, Project Manager, Architect and Customer teams to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues</li>
</ul>
<p>Requirements:</p>
<ul>
<li>US Top Secret Clearance required for this position</li>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Bachelor&#39;s degree in Computer Science, Information Systems, Engineering, or equivalent work experience</li>
<li>Ability to travel up to 30% when needed</li>
</ul>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p>About Databricks</p>
<p>Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.</p>
<p>Benefits</p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region click here.</p>
<p>Our Commitment to Diversity and Inclusion</p>
<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.</p>
<p>Compliance</p>
<p>If access to export-controlled technology or source code is required for performance of job duties, it is within Employer&#39;s discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, scope and timelines, documentation and white-boarding, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8356289002</Applyto>
      <Location>Washington, D.C.</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>d858a729-7fc</externalid>
      <Title>Senior Manager, Finance and Accounting Systems</Title>
      <Description><![CDATA[<p>You will lead the Finance Systems team responsible for architecting, scaling, and supporting enterprise platforms across GL, Revenue, Billing, Fixed Assets, Inventory, Lease Accounting. You&#39;ll partner closely with IT Product management, Finance and accounting, RevOps, Integration and Data Platforms to drive system implementation, support, compliance, automation, and integration.</p>
<p>In this role, you will own end-to-end Finance &amp; accounting system architecture spanning key business processes around GL, AP, AR, ARM, Fixed Assets, Leases and inventory accounting. You will act as the primary systems partner for Finance stakeholders, including Revenue Accounting, Operations Accounting, Tax, Fixed Assets, Treasury, GL and AP teams, while working with IT Product Management on executing the roadmap across modules aligned with business priorities.</p>
<p>You will lead a team of BSEs and administrators to deliver scalable solutions aligned to business goals, implement and execute audit ready ITGC &amp; ITAC controls for SOX Compliance, and partner with IT integration teams on cross-functional workflows and AI agents for process automation and optimization. You will also provide leadership for Finance systems admin support, GL, and operational excellence.</p>
<p>The base salary range for this role is $207,000 to $250,000. The starting salary will be determined based on job-related knowledge, skills, experience, and market location. We strive for both market alignment and internal equity when determining compensation. In addition to base salary, our total rewards package includes a discretionary bonus, equity awards, and a comprehensive benefits program (all based on eligibility).</p>
<p>The range we’ve posted represents the typical compensation range for this role. To determine actual compensation, we review the market rate for each candidate which can include a variety of factors. These include qualifications, experience, interview performance, and location.</p>
<p>In addition to a competitive salary, we offer a variety of benefits to support your needs, including:</p>
<ul>
<li>Medical, dental, and vision insurance - 100% paid for by CoreWeave</li>
</ul>
<ul>
<li>Company-paid Life Insurance</li>
</ul>
<ul>
<li>Voluntary supplemental life insurance</li>
</ul>
<ul>
<li>Short and long-term disability insurance</li>
</ul>
<ul>
<li>Flexible Spending Account</li>
</ul>
<ul>
<li>Health Savings Account</li>
</ul>
<ul>
<li>Tuition Reimbursement</li>
</ul>
<ul>
<li>Ability to Participate in Employee Stock Purchase Program (ESPP)</li>
</ul>
<ul>
<li>Mental Wellness Benefits through Spring Health</li>
</ul>
<ul>
<li>Family-Forming support provided by Carrot</li>
</ul>
<ul>
<li>Paid Parental Leave</li>
</ul>
<ul>
<li>Flexible, full-service childcare support with Kinside</li>
</ul>
<ul>
<li>401(k) with a generous employer match</li>
</ul>
<ul>
<li>Flexible PTO</li>
</ul>
<ul>
<li>Catered lunch each day in our office and data center locations</li>
</ul>
<ul>
<li>A casual work environment</li>
</ul>
<ul>
<li>A work culture focused on innovative disruption</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$207,000 to $250,000</Salaryrange>
      <Skills>ERP, finance systems leadership, technical teams management, finance processes, RevRec, GL, Fixed Assets, Tax, Inventory accounting management, Lease Accounting, treasury and Payments, Netsuite OneWorld, Advanced Financials, ARM, Multibook, Bank Payments, Squareworks AP Automation, Netgain - Netloan/NetAssets, Salesforce, Coupa, Kyriba, data platforms, SOX controls, auditors, AI use for addressing use cases in Finance &amp; accounting, Ramp, Navan, Costar, FloQast, Auditboard, Workiva</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>CoreWeave</Employername>
      <Employerlogo>https://logos.yubhub.co/coreweave.com.png</Employerlogo>
      <Employerdescription>CoreWeave delivers a platform of technology, tools, and teams that enables innovators to build and scale AI with confidence. It became a publicly traded company in March 2025.</Employerdescription>
      <Employerwebsite>https://www.coreweave.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/coreweave/jobs/4655657006</Applyto>
      <Location>Sunnyvale, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>e7613e05-073</externalid>
      <Title>Customer Enablement Specialist</Title>
      <Description><![CDATA[<p>Job Title: Customer Enablement Specialist</p>
<p>Location: Bellevue, Washington</p>
<p>Department: Education &amp; Training</p>
<p>CSQ227R234</p>
<p><strong>About the Role</strong></p>
<p>This role is required to work in a hybrid office setting in our Bellevue, WA office.</p>
<p><strong>The Opportunity</strong></p>
<p>Databricks runs some of the largest customer enablement programs in the industry: workshops, digital courses, labs, and webinars that reach thousands of users. The Customer Enablement Specialist turns that reach into results. You connect engaged learners to structured training plans that drive product adoption, customer success, and measurable business impact.</p>
<p>This isn’t a sales or business development role; every conversation begins with an existing Databricks user or program participant. Your focus is on helping those customers move from initial interest to tangible capability: skilled teams, completed training milestones, and activated use cases.</p>
<p>You’ll manage a broad portfolio of accounts, supporting new and emerging personas (business users, analysts, and app developers) and helping them succeed with Databricks’ latest innovations in AI/BI, Databricks Apps, and agent-based development.</p>
<p><strong>What You&#39;ll Do</strong></p>
<ul>
<li>Convert participation in Databricks’ scale programs (webinars, workshops, digital learning) into structured training engagements.</li>
</ul>
<ul>
<li>Own a high-volume enablement pipeline, identifying learner needs, recommending tailored paths, and tracking adoption progress.</li>
</ul>
<ul>
<li>Deliver engaging L100–L200 sessions and demos to help new personas understand what’s possible with Databricks.</li>
</ul>
<ul>
<li>Build enablement plans for each account, tracking trained users, completion rates, and milestone achievement.</li>
</ul>
<ul>
<li>Partner with Customer Success Managers (CSMs), Account Executives (AEs), and senior CEAs to align training with customer goals and renewal cycles.</li>
</ul>
<ul>
<li>Report key metrics (trained accounts, learner growth, conversion rates, and training revenue), using data to guide your priorities.</li>
</ul>
<ul>
<li>Provide structured feedback to program and curriculum teams to sharpen future customer learning experiences.</li>
</ul>
<p><strong>What You Bring</strong></p>
<ul>
<li>2–4 years in a technical, customer-facing role; technical training, pre-sales, enablement, or customer success preferred.</li>
</ul>
<ul>
<li>Hands-on familiarity with modern data and analytics platforms (Databricks, cloud SQL, BI tools, or data lakes).</li>
</ul>
<ul>
<li>Confidence delivering introductory technical content to non-expert audiences.</li>
</ul>
<ul>
<li>Working knowledge of AI/ML concepts; able to explain how Databricks enables practical use cases.</li>
</ul>
<ul>
<li>Strong communication skills and a consultative approach: discover needs, recommend paths, and gain commitment.</li>
</ul>
<ul>
<li>A data-driven mindset with strong organisational habits and comfort managing many concurrent accounts.</li>
</ul>
<ul>
<li>Team-first attitude: a proactive collaborator who knows when to escalate for deeper technical support.</li>
</ul>
<p><strong>Bonus Points</strong></p>
<ul>
<li>Databricks certifications (e.g., Data Engineer Associate), or willingness to obtain them within 6 months.</li>
</ul>
<ul>
<li>Background in SaaS, cloud, or data platforms; familiarity with BI or AI/BI tools (Databricks Genie, Tableau, Power BI).</li>
</ul>
<ul>
<li>Exposure to Databricks Apps, REST APIs, or AI agent concepts.</li>
</ul>
<ul>
<li>Experience in a role with enablement or training-related revenue metrics.</li>
</ul>
<p><strong>Why This Role, Why Now</strong></p>
<p>New products create new skill gaps. As Databricks expands into AI/BI, Databricks Apps, and agent-based development, a new wave of users (business analysts, app builders, domain experts) needs to get skilled up quickly. The depth-focused CEA team handles the complex, strategic, and deeply technical. This role focuses on the broad middle: high volume, new personas, and the scale-to-commitment motion that turns digital participation into real adoption. It is a high-visibility, high-impact position with a clear growth path into senior CEA work as you build depth and a track record.</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location is in visit our page here.</p>
<p>Zone 2 Pay Range $86,600-$119,150 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$86,600-$119,150 USD</Salaryrange>
      <Skills>data and analytics platforms, cloud SQL, BI tools, data lakes, AI/ML concepts, Databricks Apps, REST APIs, AI agent concepts, Databricks certifications, SaaS, cloud, data platforms, BI or AI/BI tools</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8431935002</Applyto>
      <Location>Bellevue, Washington</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>cbd81d47-d7e</externalid>
      <Title>Data Platform Solutions Architect (Professional Services)</Title>
      <Description><![CDATA[<p>We&#39;re hiring for multiple roles within our Professional Services team. This position may be offered as Senior Solutions Consultant, Resident Solutions Architect, or Senior Resident Solutions Architect. The final title will align to your experience, technical depth, and customer-facing ownership.</p>
<p>As a Big Data Solutions Architect (Internal Title - Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to customers&#39; successful understanding, evaluation, and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Extensive experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 10% of the time</li>
</ul>
<p>Preferred, but not essential: Databricks Certification</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8486738002</Applyto>
      <Location>London, United Kingdom</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8efd6b3b-251</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
</ul>
<ul>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
</ul>
<ul>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
</ul>
<ul>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to customers&#39; successful understanding, evaluation, and adoption of Databricks.</li>
</ul>
<ul>
<li>Provide an escalated level of support for customer operational issues.</li>
</ul>
<ul>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
</ul>
<ul>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
</ul>
<ul>
<li>Comfortable writing code in either Python or Scala</li>
</ul>
<ul>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
</ul>
<ul>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
</ul>
<ul>
<li>Familiarity with CI/CD for production deployments</li>
</ul>
<ul>
<li>Working knowledge of MLOps</li>
</ul>
<ul>
<li>Design and deployment of performant end-to-end data architectures</li>
</ul>
<ul>
<li>Experience with technical project delivery - managing scope and timelines.</li>
</ul>
<ul>
<li>Documentation and white-boarding skills.</li>
</ul>
<ul>
<li>Experience working with clients and managing conflicts.</li>
</ul>
<ul>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
</ul>
<ul>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456973002</Applyto>
      <Location>Boston, Massachusetts</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6d94d7ea-9ca</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>The impact you will have:</p>
<ul>
<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
</ul>
<ul>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
</ul>
<ul>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
</ul>
<ul>
<li>Consult on architecture and design; bootstrap hands-on projects that lead to customers&#39; successful understanding, evaluation, and adoption of Databricks.</li>
</ul>
<ul>
<li>Provide an escalated level of support for customer operational issues.</li>
</ul>
<ul>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
</ul>
<ul>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years experience in data engineering, data platforms &amp; analytics</li>
</ul>
<ul>
<li>Comfortable writing code in either Python or Scala</li>
</ul>
<ul>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
</ul>
<ul>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
</ul>
<ul>
<li>Familiarity with CI/CD for production deployments</li>
</ul>
<ul>
<li>Working knowledge of MLOps</li>
</ul>
<ul>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
</ul>
<ul>
<li>Experience with technical project delivery - managing scope and timelines.</li>
</ul>
<ul>
<li>Documentation and white-boarding skills.</li>
</ul>
<ul>
<li>Experience working with clients and managing conflicts.</li>
</ul>
<ul>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
</ul>
<ul>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, design and deployment of highly performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461330002</Applyto>
      <Location>Washington, D.C.</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3d22e39a-bde</externalid>
      <Title>Data Analyst II</Title>
      <Description><![CDATA[<p>Why join us</p>
<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>
<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>
<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>
<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>
<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>
<p>Data at Brex</p>
<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>
<p>Our Data Scientists, Analysts, and Engineers work together to make data, and the insights derived from data, a core asset across the company.</p>
<p>What you’ll do</p>
<p>As a Data Analyst II (DA), you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>
<p>You will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses.</p>
<p>This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>
<p>Where you’ll work</p>
<p>This role will be based in our San Francisco office.</p>
<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home.</p>
<p>We currently require a minimum of three coordinated days in the office per week: Monday, Wednesday, and Thursday.</p>
<p>As a perk, we also have up to four weeks per year of fully remote work!</p>
<p>Responsibilities</p>
<p>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</p>
<p>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</p>
<p>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</p>
<p>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</p>
<p>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</p>
<p>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</p>
<p>Contribute to the automation of recurring analyses and reporting workflows using Python.</p>
<p>Requirements</p>
<p>3+ years of experience in data analytics or a related role in a professional setting.</p>
<p>2+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</p>
<p>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</p>
<p>Experience with Python for data analysis, automation, or scripting.</p>
<p>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</p>
<p>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</p>
<p>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</p>
<p>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automate reporting, and build self-service data tools.</p>
<p>Bonus points</p>
<p>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</p>
<p>Familiarity with dbt for data modeling and transformation.</p>
<p>Exposure to data pipeline orchestration tools (e.g., Airflow).</p>
<p>Experience in fintech, financial services, or payments.</p>
<p>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</p>
<p>Compensation</p>
<p>The expected salary range for this role is $93,600 - $117,000.</p>
<p>However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity.</p>
<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$93,600 - $117,000</Salaryrange>
      <Skills>SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, Financial services, Payments</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is an intelligent finance platform that enables companies to spend smarter and move faster in over 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8463696002</Applyto>
      <Location>San Francisco, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8aca0e87-abb</externalid>
      <Title>Strategic AI/BI Account Executive</Title>
      <Description><![CDATA[<p>We are seeking a Strategic AI/BI Account Executive to help enterprise customers transform how business users interact with data. This high-impact role sits within the AI Go-To-Market team and partners closely with Enterprise Account Executives to drive adoption of Databricks AI/BI and Genie.</p>
<p>You will help organizations move beyond static dashboards to governed, conversational, AI-powered analytics at the center of the convergence of business intelligence, data platforms, and generative AI.</p>
<p>Enterprise analytics is rapidly evolving from dashboards and static reporting to conversational, AI-driven decision platforms. Databricks AI/BI and Genie empower business users to securely interact with governed data using natural language, transforming the data platform into a true decision platform.</p>
<p>If you want to be at the forefront of AI-powered analytics transformation at one of the fastest-growing data and AI companies in the world, this is your opportunity.</p>
<p>The impact you will have:</p>
<ul>
<li>Partner with Enterprise AEs to identify, qualify, and close AI/BI opportunities</li>
<li>Engage C-level, analytics, and line-of-business leaders to modernize analytics strategies</li>
<li>Displace or expand legacy BI platforms with AI-powered, governed analytics solutions</li>
<li>Lead conversations around semantic governance, self-service analytics, and natural language data access</li>
<li>Drive proof-of-value engagements and scale enterprise-wide adoption</li>
<li>Align AI/BI initiatives to measurable business outcomes (productivity, speed to insight, revenue impact)</li>
<li>Enable field teams and serve as a subject matter expert on modern analytics architectures</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Enterprise sales experience in BI, analytics, data platforms, or AI/ML</li>
<li>Strong understanding of modern analytics architectures and data governance</li>
<li>Ability to sell to both technical and business stakeholders</li>
<li>Executive presence and experience navigating complex buying cycles</li>
<li>Passion for AI and the impact of GenAI on enterprise analytics</li>
<li>Experience operating in a specialist or overlay sales model</li>
<li>Ability to translate technical capabilities into clear business value</li>
<li>7+ years of Enterprise Sales experience, exceeding quotas in larger accounts</li>
<li>Bachelor&#39;s Degree or equivalent experience</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise sales experience in BI, analytics, data platforms, or AI/ML, Strong understanding of modern analytics architectures and data governance, Ability to sell to both technical and business stakeholders, Executive presence and experience navigating complex buying cycles, Passion for AI and the impact of GenAI on enterprise analytics</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform. It has over 10,000 organizations worldwide as clients.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8441888002</Applyto>
      <Location>London, United Kingdom</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>43ca4523-d1d</externalid>
      <Title>Customer Enablement Specialist</Title>
      <Description><![CDATA[<p>As a Customer Enablement Specialist at Databricks, you will play a critical role in helping customers succeed with our data and AI platform. You will be responsible for managing a broad portfolio of accounts, supporting new and emerging personas, and helping them succeed with Databricks&#39; latest innovations in AI/BI, Databricks Apps, and agent-based development.</p>
<p>Your main objective will be to convert participation in Databricks&#39; scale programs (webinars, workshops, digital learning) into structured training engagements. You will own a high-volume enablement pipeline, identifying learner needs, recommending tailored paths, and tracking adoption progress.</p>
<p>To achieve this, you will deliver engaging L100–L200 sessions and demos to help new personas understand what&#39;s possible with Databricks. You will build enablement plans for each account, tracking trained users, completion rates, and milestone achievement.</p>
<p>You will partner with Customer Success Managers (CSMs), Account Executives (AEs), and senior CEAs to align training with customer goals and renewal cycles. You will report key metrics – trained accounts, learner growth, conversion rates, and training revenue – using data to guide your priorities.</p>
<p>In addition, you will provide structured feedback to program and curriculum teams to sharpen future customer learning experiences.</p>
<p>We are looking for a highly motivated and organized individual with excellent communication skills and a consultative approach. You should have hands-on familiarity with modern data and analytics platforms, confidence delivering introductory technical content to non-expert audiences, and a working knowledge of AI/ML concepts.</p>
<p>If you are passionate about helping customers succeed and have a strong desire to learn and grow with our company, we encourage you to apply for this exciting opportunity.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$86,600-$119,150 USD</Salaryrange>
      <Skills>modern data and analytics platforms, customer-facing role, technical training, pre-sales, enablement, customer success, AI/BI, Databricks Apps, agent-based development, Databricks certifications, SaaS, cloud, data platforms, BI or AI/BI tools, Databricks Genie, Tableau, Power BI</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8431927002</Applyto>
      <Location>Bellevue, Washington</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>fd64db3e-49f</externalid>
      <Title>Staff Software Engineer – Customer Experience Intelligence (CXI)</Title>
      <Description><![CDATA[<p>At Databricks, we&#39;re shaping the future of how customers experience support at scale. As the Staff Technical Lead for Customer Experience Intelligence, you&#39;ll design intelligent, AI-powered systems that make support faster, smarter, and more effortless.</p>
<p>In this role, you&#39;ll have end-to-end ownership of the architecture and technical strategy behind automation and agentic workflows that reduce mean time to mitigate (MTTM), boost quality, and enable our Support organization to scale impact without scaling headcount. You&#39;ll work hands-on with teams across Support, Product, and Platform Engineering to build seamless systems that anticipate customer needs before they arise.</p>
<p>You&#39;ll lead the technical foundation that transforms how customers experience support, where issues are auto-diagnosed, solutions are delivered instantly, and engineers focus their time on the toughest challenges. Your success will mean customers moving faster, trusting Databricks deeper, and feeling the impact of your systems every day.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Owning the technical vision and architecture for Databricks&#39; Support Automation and Tooling ecosystem</li>
<li>Leading hands-on development of automation to improve customer experience and Support scalability</li>
<li>Driving rapid, iterative development while upholding quality, safety, and reliability standards</li>
<li>Designing agentic workflows that evolve from human-in-the-loop to fully automated systems</li>
<li>Implementing observability, transparency, and rollback mechanisms for AI-driven decisions</li>
<li>Acting as the primary technical interface between Support, Product, and Platform Engineering to align technical roadmaps and unblock dependencies</li>
<li>Setting a high engineering bar for quality, reliability, and maintainability in line with Databricks standards</li>
<li>Mentoring engineers and SMEs across Software and Support Engineering functions</li>
</ul>
<p>We&#39;re looking for someone with:</p>
<ul>
<li>A BS or higher degree in Computer Science or a related field</li>
<li>Technical leadership experience in large projects similar to those described, including automation tooling, distributed systems, and APIs</li>
<li>Extensive full-stack development experience</li>
<li>Proven success designing and deploying production-grade automation in complex technical environments</li>
<li>Hands-on experience with ML-assisted systems, decision support, or agentic automation</li>
<li>Deep familiarity with distributed data platforms, developer tooling, and large-scale infrastructure systems</li>
<li>Understanding of multi-cloud environments (AWS, Azure, GCP), compliance, and security constraints</li>
</ul>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range for this role is $190,000-$261,250 USD.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$190,000-$261,250 USD</Salaryrange>
      <Skills>Automation tooling, Distributed systems, APIs, Full-stack development, ML-assisted systems, Decision support, Agentic automation, Distributed data platforms, Developer tooling, Large-scale infrastructure systems, Multi-cloud environments, Compliance, Security constraints</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks builds and operates the world&apos;s best data and AI infrastructure platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8416959002</Applyto>
      <Location>Mountain View, California; San Francisco, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b96ce52c-eaa</externalid>
      <Title>Engineering Manager, Onboarding</Title>
      <Description><![CDATA[<p>Why join us</p>
<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>
<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>
<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>
<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>
<p><strong>Engineering</strong></p>
<p>Engineering at Brex is about building systems that scale with speed and intention. Our teams span Software, Data, Security, and IT, and operate with high autonomy and deep collaboration. We tackle hard technical problems, own our outcomes, and push for excellence at every level, from architecture to deployment.</p>
<p>It’s an environment where engineering is a craft, and builders become leaders.</p>
<p><strong>What you’ll do</strong></p>
<p>You will lead an engineering team focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, and integration workflows that help customers realize value quickly.</p>
<p>This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced customer journeys.</p>
<p>The ideal candidate is an engineering leader with experience scaling user-facing onboarding systems, delivering high-quality product experiences, and partnering deeply across Product, Design, Operations, and GTM teams.</p>
<p><strong>Where you’ll work</strong></p>
<p>This role will be based in our San Francisco office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday.</p>
<p>As a perk, we also have up to four weeks per year of fully remote work!</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding experience.</li>
</ul>
<ul>
<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>
</ul>
<ul>
<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>
</ul>
<ul>
<li>Drive execution of the Onboarding roadmap, ensuring timely, high-quality delivery of systems and features that help customers activate and realize value.</li>
</ul>
<ul>
<li>Lead and manage a team of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>
</ul>
<ul>
<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>
</ul>
<ul>
<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>
</ul>
<ul>
<li>6+ years of software engineering experience with strong technical depth.</li>
</ul>
<ul>
<li>3+ years of experience managing or leading engineers in a high-growth environment.</li>
</ul>
<ul>
<li>Strong technical background and understanding of software development principles.</li>
</ul>
<ul>
<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>
</ul>
<ul>
<li>Regularly works with cross-functional partners (e.g. Product, Design, Operations, Sales) and excels in driving alignment across stakeholders.</li>
</ul>
<ul>
<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>
</ul>
<ul>
<li>Track record building AI-powered product experiences, including LLM-driven automation and personalization.</li>
</ul>
<p><strong>Bonus points</strong></p>
<ul>
<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>
</ul>
<ul>
<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer-facing experiences.</li>
</ul>
<ul>
<li>You have started your own technology venture or were an early technical founder/employee. We value entrepreneurial spirit &amp; scrappiness!</li>
</ul>
<p><strong>Compensation</strong></p>
<p>The expected salary range for this role is $240,000 - $300,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$240,000 - $300,000</Salaryrange>
      <Skills>Software engineering, Technical leadership, AI-powered product experiences, Data-driven mindset, Cross-functional collaboration, Full-stack engineering, Data platforms, Workflow automation, Customer lifecycle products, LLM-driven automation, Personalization</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8461600002</Applyto>
      <Location>San Francisco, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8664b981-66c</externalid>
      <Title>Data Platform Solutions Architect (Professional Services) - Emerging Enterprise &amp; DNB</Title>
      <Description><![CDATA[<p>We&#39;re hiring for multiple roles within our Professional Services team. Depending on experience and scope, this position may be offered as a Senior Solutions Consultant or a Resident Solutions Architect. You may know this role as a Big Data Solutions Architect, Analytics Architect, Data Platform Architect, or Technical Consultant. The final title will align to your experience, technical depth, and customer-facing ownership.</p>
<p>As a Data Platform Solutions Architect on our Professional Services team for the Emerging Enterprise &amp; Digital Natives business in EMEA, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Drive high-impact customer projects: Design and build reference architectures, implement production use cases, and create “how-to” guides tailored to the unique needs of fast-moving Emerging Enterprise &amp; Digital Native customers in EMEA.</li>
</ul>
<ul>
<li>Collaborate on project scoping: Work closely with Engagement Managers and customers to define project scope, schedules, and deliverables for professional services engagements.</li>
</ul>
<ul>
<li>Enable transformational initiatives: Guide strategic customers through their end-to-end big data journeys, migrating from legacy platforms and deploying industry-leading data and AI applications on the Databricks platform.</li>
</ul>
<ul>
<li>Consult on architecture &amp; design: Provide thought leadership on solution design and implementation strategies, ensuring customers can successfully evaluate and adopt Databricks.</li>
</ul>
<ul>
<li>Offer advanced support: Serve as an escalation point for operational issues, collaborating with Databricks Support and Engineering to resolve challenges quickly.</li>
</ul>
<ul>
<li>Align technical delivery: Partner with cross-functional Databricks teams (Technical, PM, Architecture, and Customer Success) to align on milestones, ensuring customer needs and deadlines are met.</li>
</ul>
<ul>
<li>Amplify product feedback: Provide implementation insights to Databricks Product and Support teams, guiding rapid improvements in features and troubleshooting for customers.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Extensive experience in data engineering, data platforms &amp; analytics</li>
</ul>
<ul>
<li>Comfortable writing code in either Python or Scala</li>
</ul>
<ul>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
</ul>
<ul>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
</ul>
<ul>
<li>Familiarity with CI/CD for production deployments</li>
</ul>
<ul>
<li>Working knowledge of MLOps</li>
</ul>
<ul>
<li>Design and deployment of performant end-to-end data architectures</li>
</ul>
<ul>
<li>Experience with technical project delivery - managing scope and timelines.</li>
</ul>
<ul>
<li>Documentation and white-boarding skills.</li>
</ul>
<ul>
<li>Experience working with clients and managing conflicts.</li>
</ul>
<ul>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
</ul>
<ul>
<li>Travel to customers 10% of the time</li>
</ul>
<ul>
<li>Preferred, but not essential: Databricks Certification</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8439047002</Applyto>
      <Location>London, United Kingdom</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>72260a13-6ce</externalid>
      <Title>Web Analytics &amp; Insights Manager</Title>
      <Description><![CDATA[<p>Join the team as Twilio&#39;s next Web Analytics &amp; Insights Manager.</p>
<p>This position is needed to partner in driving a scalable, holistic web analytics implementation strategy while maintaining the data integrity of the existing setup, ultimately ensuring that Twilio.com web tracking capabilities continue to grow, are utilized by the company, and remain accurate and reliable. The ideal candidate collaborates with several cross-functional teams at Twilio, such as UX, CRO, Web Strategy, Web Dev, Demand Gen, SEO, Content, and more, to deliver scalable and easy-to-use web analytics solutions and insights.</p>
<p>As a subject matter expert in this space, you will grow your presence company-wide and eventually own relationships cross-functionally.</p>
<p>We are looking for someone with proven experience in GA4 implementation, business/marketing analytics reporting, analysis and stakeholder communications.</p>
<p>Responsibilities:</p>
<ul>
<li>Work closely with the Web team to deliver desired web analytics tracking for business stakeholders</li>
<li>Own and maintain a holistic GA/web analytics strategy, implementation, troubleshooting, documentation, and enablement</li>
<li>Brainstorm with technical and non-technical stakeholders on creative ways to meet data requirements, while balancing site performance, holistic tagging strategies, and expanded data capabilities</li>
<li>Drive continual web analytics capability improvements, and maintain high standards for data integrity</li>
<li>Own and share recurring monthly and ad-hoc analysis, deep dives, and storytelling to uncover the &#39;so what&#39;, providing actionable insights and recommendations across site, page, channel, and A/B testing performance</li>
<li>Strategize and build scalable solutions that support company-wide enablement and the utilization of web analytics for data-driven decision-making</li>
<li>Partner with our CDP SMEs to orchestrate a unified data strategy, maximizing the integration between our CDP and Google Analytics to drive a 360-degree view of the customer journey</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>5 years of experience in web analytics</li>
<li>Expertise in Google Analytics 4, both implementation and analysis.</li>
<li>Strong knowledge of JavaScript, HTML, and browser dev tools.</li>
<li>Hands-on experience with Google Tag Manager (GTM)</li>
<li>Familiarity with A/B and multivariate testing frameworks, as well as web personalization</li>
<li>Strong communication skills with the ability to communicate effectively with both technical and non-technical audiences</li>
<li>Detail-oriented with a passion for clean, trustworthy data.</li>
<li>Strong analytical skills with a track record of independently exploring data, delivering insights, storytelling, and actionable recommendations.</li>
<li>Excellent collaborator across cross functional teams; skilled in managing stakeholder projects</li>
<li>Highly reliable and accountable.</li>
<li>A self-starter.</li>
<li>Interest or experience in applying AI-driven solutions to analytics workflows.</li>
</ul>
<p>Desired:</p>
<ul>
<li>Familiarity with Big Query, Adobe AEM, Adobe Target</li>
<li>Interest or exposure to fraud monitoring tools and processes.</li>
<li>Experience with Looker Studio, Tableau, or other BI tools.</li>
<li>Working knowledge of Customer Data Platforms (CDPs) and analytics data integrations.</li>
<li>A deeper understanding of:</li>
</ul>
<ul>
<li>First party and third party cookies</li>
<li>Cross domain tracking (including the limitations of web protocols and UTM strategies)</li>
<li>Tagging and tracking strategies (e.g., when to use Segment → GA4 vs. GTM → GA4 for optimal efficiency)</li>
<li>Ad pixel setup and UTM deployment strategies</li>
</ul>
<p>Travel:</p>
<p>We prioritize connection and opportunities to build relationships with our customers and each other. For this role, approximately 15% travel is anticipated to help you connect in-person in a meaningful way.</p>
<p>What We Offer:</p>
<p>Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.</p>
<p>Compensation:</p>
<p>*Please note this role is open to candidates outside of California, Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, New Jersey, New York, Vermont, Washington D.C., and Washington State. The information below is provided for candidates hired in those locations only. The estimated pay ranges for this role are as follows:</p>
<p>Based in Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, Vermont or Washington D.C. : $106,320 - $132,900. Based in New York, New Jersey, Washington State, or California (outside of the San Francisco Bay area): $112,560 - $140,700. Based in the San Francisco Bay area, California: $125,040 - $156,300.</p>
<p>This role may be eligible to participate in Twilio’s equity plan and corporate bonus plan. All roles are generally eligible for the following benefits: health care insurance, 401(k) retirement account, paid sick time, paid personal time off, paid parental leave. The successful candidate’s starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location.</p>
<p>Application deadline information:</p>
<p>Applications for this role are intended to be accepted until April 30, 2026, but may change based on business needs.</p>
<p>Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That&#39;s why we seek out colleagues who embody our values , something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you&#39;re ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn&#39;t what you&#39;re looking for, please consider other open positions.</p>
<p>Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel></Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Google Analytics 4, JavaScript, HTML, Browser dev tools, Google Tag Manager, A/B and multivariate testing frameworks, Web personalization, Communication skills, Analytical skills, Data analysis, Insights, Storytelling, Actionable recommendations, Big Query, Adobe AEM, Adobe Target, Fraud monitoring tools, Looker Studio, Tableau, Customer Data Platforms, Analytics data integrations</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Twilio</Employername>
      <Employerlogo>https://logos.yubhub.co/twilio.com.png</Employerlogo>
      <Employerdescription>Twilio is a cloud communication platform that provides a range of APIs and services for building, scaling, and operating real-time communication and collaboration applications.</Employerdescription>
      <Employerwebsite>https://www.twilio.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/twilio/jobs/7736718</Applyto>
      <Location>Remote - US</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>2afc821d-248</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494149002</Applyto>
      <Location>Philadelphia, Pennsylvania</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>34a0bf55-11a</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461222002</Applyto>
      <Location>Boston, Massachusetts</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7d723067-22d</externalid>
      <Title>Resident Solutions Architect - Healthcare &amp; Life Sciences</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494144002</Applyto>
      <Location>Dallas, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7e5c6f46-bb6</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456975002</Applyto>
      <Location>Dallas, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b4a461d1-b6b</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform. You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a company that provides a data and AI platform. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494128002</Applyto>
      <Location>Washington, D.C.</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>32d8d11d-9dc</externalid>
      <Title>Resident Solutions Architect - Healthcare &amp; Life Sciences</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8371312002</Applyto>
      <Location>New York City, New York</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3e92e8a2-811</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494130002</Applyto>
      <Location>New York City, New York</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>4cd630c8-77d</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>Job Title: Resident Solutions Architect - Public Sector</p>
<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>Responsibilities:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope various professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet customer needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>Requirements:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Pay Range Transparency:</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range for this role is $180,656-$248,360 USD.</p>
<p>About Databricks:</p>
<p>Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics, and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</p>
<p>Benefits:</p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, click here.</p>
<p>Our Commitment to Diversity and Inclusion:</p>
<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation, white-boarding</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform. The company was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494137002</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>50d65da2-2e4</externalid>
      <Title>Resident Solutions Architect - Healthcare &amp; Life Sciences</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform. Your responsibilities will include providing data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>You will work on a variety of impactful customer technical projects, including designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases. You will also work with engagement managers to scope professional services work with input from the customer, guide strategic customers as they implement transformational big data projects, and consult on architecture and design.</p>
<p>To be successful in this role, you will need to have 6+ years of experience in data engineering, data platforms, and analytics, be comfortable writing code in either Python or Scala, and have working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one. You should also have deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals.</p>
<p>The pay range for this role is $180,656-$248,360 USD, and the total compensation package may also include eligibility for annual performance bonus, equity, and benefits.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms, analytics, Python, Scala, Cloud ecosystems, Apache Spark, distributed computing</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data engineering, data science, and AI. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494143002</Applyto>
      <Location>Chicago, Illinois</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8d8b3af4-285</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494147002</Applyto>
      <Location>Atlanta, Georgia</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1d222227-15b</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have:</p>
<ul>
<li>Databricks Certification</li>
</ul>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456969002</Applyto>
      <Location>Chicago, Illinois</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6ac59b11-215</externalid>
      <Title>Engineering Manager, Onboarding</Title>
      <Description><![CDATA[<p>Why join us</p>
<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>
<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>
<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>
<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>
<p>What you’ll do</p>
<p>You will lead an engineering team focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, and integrations workflows that help customers realize value quickly.</p>
<p>This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced customer journeys.</p>
<p>The ideal candidate is an engineering leader with experience scaling user-facing onboarding systems, delivering high-quality product experiences, and partnering deeply across Product, Design, Operations, and GTM teams.</p>
<p>Responsibilities</p>
<ul>
<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding experience.</li>
<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>
<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>
<li>Drive execution of the Onboarding roadmap, ensuring timely, high-quality delivery of systems and features that help customers activate and realize value.</li>
<li>Lead and manage a team of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>
<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>
<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>
</ul>
<p>Requirements</p>
<ul>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>
<li>6+ years of software engineering experience with strong technical depth.</li>
<li>3+ years of experience managing or leading engineers in a high-growth environment.</li>
<li>Strong technical background and understanding of software development principles.</li>
<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>
<li>Regularly works with cross-functional partners (e.g. Product, Design, Operations, Sales) and excels in driving alignment across stakeholders.</li>
<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>
<li>Track record building AI-powered product experiences, including LLM-driven automation and personalization.</li>
</ul>
<p>Bonus points</p>
<ul>
<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>
<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer facing experiences.</li>
<li>You have started your own technology venture or were an early technical founder/employee.</li>
</ul>
<ul>
<li>Experience Level: senior</li>
<li>Employment Type: full-time</li>
<li>Workplace Type: hybrid</li>
<li>Category: Engineering</li>
<li>Industry: Technology</li>
<li>Salary Range: $240,000 CAD - $300,000 CAD</li>
</ul>
<p>Required Skills:</p>
<ul>
<li>Software engineering experience</li>
<li>Technical leadership</li>
<li>AI-powered product experiences</li>
<li>Data-driven mindset</li>
<li>Cross-functional collaboration</li>
</ul>
<p>Preferred Skills:</p>
<ul>
<li>Data platforms (Snowflake, Hex)</li>
<li>Onboarding, implementation, identity, workflow automation</li>
<li>Customer lifecycle products</li>
<li>Technology venture founding/early employee experience</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$240,000 CAD - $300,000 CAD</Salaryrange>
      <Skills>Software engineering experience, Technical leadership, AI-powered product experiences, Data-driven mindset, Cross-functional collaboration, Data platforms (Snowflake, Hex), Onboarding, implementation, identity, workflow automation, Customer lifecycle products, Technology venture founding/early employee experience</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8461597002</Applyto>
      <Location>Vancouver, British Columbia, Canada</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6fed2bb6-3b6</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>Your responsibilities will include:</p>
<ul>
<li>Designing and building reference architectures for customers</li>
<li>Creating how-to&#39;s and productionalizing customer use cases</li>
<li>Guiding strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consulting on architecture and design; bootstrapping or implementing customer projects which lead to customers&#39; successful understanding, evaluation and adoption of Databricks</li>
<li>Providing an escalated level of support for customer operational issues</li>
<li>Working with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs</li>
<li>Working with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues</li>
</ul>
<p>To be successful in this role, you will need:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>The pay range for this role is $180,656-$248,360 USD per year, depending on location and experience.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD per year</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461321002</Applyto>
      <Location>Chicago, Illinois</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>32a63583-c9b</externalid>
      <Title>Sr. Analyst, Strategic Measurement</Title>
      <Description><![CDATA[<p>At Twilio, we&#39;re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences.</p>
<p>Our dedication to remote-first work, and strong culture of connection and global inclusion means that no matter your location, you’re part of a vibrant team with diverse experiences making a global impact each day.</p>
<p>As we continue to revolutionize how the world interacts, we’re acquiring new skills and experiences that make work feel truly rewarding.</p>
<p>Join the team as Twilio’s next Sr. Analyst, Strategic Measurement.</p>
<p>About the job</p>
<p>We are actively recruiting for this role to fill an existing vacancy. This position is needed to serve as the &quot;New Frontiers&quot; lead for the Marketing Strategy, Analytics &amp; Innovation team, focusing on ambiguous, highly strategic emerging channels, potential programs &amp; measurement optimization/instrumentation.</p>
<p>You will be responsible for defining, measuring, and accelerating our growth in new digital landscapes, specifically focusing on AI Engine Optimization (AEO), brand measurement and advanced attribution frameworks.</p>
<p>Partnering closely with the global marketing analytics teams, Brand, Demand Gen, PR, Content, and Developer Network teams, you will execute against high-visibility objectives enabling marketing&#39;s business impact to executive leadership.</p>
<p>Ideally, this position has a strong statistical background, a substantial capacity for hypothesis-driven thinking, and the ability to deliver &quot;unmistakable clarity&quot; in complex executive presentations.</p>
<p>Responsibilities</p>
<p>In this role, you’ll:</p>
<ul>
<li>Analyse LLM visibility and Share of Voice, create a continuous feedback loop across various teams to optimise Twilio&#39;s presence in AI search/AEO (answer engine optimisation), and drive enablement and operational efforts to ensure consistent tool usage and storyline.</li>
<li>Support the strategy, data ingestion, and modelling required to build and maintain net new and existing brand measurement components. Drive a single, cohesive narrative combining disparate data sets.</li>
<li>Inspire experimentation adoption in teams. Build the frameworks and structure to drive experimentation rigor and clarity of incrementality of marketing initiatives.</li>
<li>Support Media Mix Modelling (MMM) and Multi-Touch Attribution (MTA) initiatives to understand the impact of various marketing channels on Sign Ups, SQLs, and activations, and guide budget allocation.</li>
<li>Translate complex data sets into actionable insights and strategic recommendations for exec-level audiences.</li>
<li>Own enablement efforts for emerging channels, sharing accomplishments, KPIs, and cross-team collaborations broadly across the marketing organisation.</li>
</ul>
<p>Qualifications</p>
<p>Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply. If your career is just starting or hasn’t followed a traditional path, don’t let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!</p>
<p>Required:</p>
<ul>
<li>5+ years of experience in statistical analysis, experimentation and/or scenario planning.</li>
<li>A strong understanding of SEO and emerging AI Search (AEO/GEO) landscapes.</li>
<li>A hypothesis-driven thinker who does not just report the “what,” but actively tests the “why”. Able to formulate testable guesses and distill the essential from complex marketing data.</li>
<li>Experience with running or interpreting advanced measurement frameworks (Incrementality, MMM, MTA).</li>
<li>Highly reliable and a self-starter.</li>
<li>Interest or experience in applying AI-driven solutions to analytics workflows.</li>
<li>Strong communication skills with the ability to communicate effectively with both technical and non-technical audiences.</li>
<li>Detail-oriented with a passion for clean, trustworthy data.</li>
<li>Strong analytical skills.</li>
<li>Excellent collaborator across cross-functional teams; skilled in managing stakeholder projects end-to-end.</li>
</ul>
<p>Desired:</p>
<ul>
<li>Experience supporting various stakeholder functions and their insights: Brand, PR, Social, Content, Demand Generation.</li>
<li>Experience with BI tools (Tableau, Looker Studio) and modelling (ROI, LTV:CAC, payback).</li>
<li>Interest or exposure to fraud monitoring tools and processes.</li>
<li>Working knowledge of Customer Data Platforms (CDPs) and analytics data integrations.</li>
</ul>
<p>Location</p>
<p>This role will be remote, and based in Alberta, British Columbia, or Ontario, Canada.</p>
<p>Travel</p>
<p>We prioritise connection and opportunities to build relationships with our customers and each other. For this role, approximately 15% travel is anticipated to help you connect in-person in a meaningful way.</p>
<p>What We Offer</p>
<p>Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings programme, and much more. Offerings vary by location. Based on role, employees may also be eligible for additional compensation and benefits, including but not limited to incentive programmes, commissions, equity grants, health and wellness benefits, retirement contributions, and paid time off.</p>
<p>The estimated pay range for this role is $90,800 - $113,500 CAD. Target Bonus Percentage: 12.50%.</p>
<p>The successful candidate’s starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location.</p>
<p>Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That&#39;s why we seek out colleagues who embody our values, something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you&#39;re ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn’t what you’re looking for, please consider other open positions.</p>
<p>Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, colour, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify programme in certain locations, as required by law.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$90,800 - $113,500 CAD</Salaryrange>
      <Skills>statistical analysis, experimentation, scenario planning, SEO, AI Search (AEO/GEO) landscapes, hypothesis-driven thinking, advanced measurement frameworks, Incrementality, Media Mix Modelling (MMM), Multi-Touch Attribution (MTA), communication skills, data analysis, clean, trustworthy data, analytical skills, collaboration, project management, BI tools (Tableau, Looker Studio), modelling (ROI, LTV:CAC, payback), fraud monitoring tools and processes, Customer Data Platforms (CDPs), analytics data integrations</Skills>
      <Category>Marketing</Category>
      <Industry>Technology</Industry>
      <Employername>Twilio</Employername>
      <Employerlogo>https://logos.yubhub.co/twilio.com.png</Employerlogo>
      <Employerdescription>Twilio delivers innovative solutions to hundreds of thousands of businesses and empowers millions of developers worldwide to craft personalized customer experiences.</Employerdescription>
      <Employerwebsite>https://www.twilio.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/twilio/jobs/7736722</Applyto>
      <Location>Remote - Canada</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>dc943b12-110</externalid>
      <Title>Engineering Manager, Onboarding</Title>
      <Description><![CDATA[<p><strong>Job Title</strong></p>
<p>Engineering Manager, Onboarding</p>
<p><strong>About Us</strong></p>
<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>
<p><strong>Job Description</strong></p>
<p>You will lead an engineering team focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, and integrations workflows that help customers realize value quickly. This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced customer journeys.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding experience.</li>
<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>
<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>
<li>Drive execution of the Onboarding roadmap, ensuring timely, high-quality delivery of systems and features that help customers activate and realize value.</li>
<li>Lead and manage a team of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>
<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>
<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>
<li>6+ years of software engineering experience with strong technical depth.</li>
<li>3+ years of experience managing or leading engineers in a high-growth environment.</li>
<li>Strong technical background and understanding of software development principles.</li>
<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>
<li>Regularly works with cross-functional partners (e.g. Product, Design, Operations, Sales) and excels in driving alignment across stakeholders.</li>
<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>
<li>Track record building AI-powered product experiences, including LLM-driven automation and personalization.</li>
</ul>
<p><strong>Bonus Points</strong></p>
<ul>
<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>
<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer facing experiences.</li>
<li>You have started your own technology venture or were an early technical founder/employee.</li>
</ul>
<p><strong>Compensation</strong></p>
<p>The expected salary range for this role is $240,000 - $300,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$240,000 - $300,000</Salaryrange>
      <Skills>Software engineering, Leadership, Technical direction, Cross-functional collaboration, Data-driven decision making, AI-powered product experiences, LLM-driven automation, Personalization, Data platforms, Onboarding and implementation, Identity and workflow automation, Customer lifecycle products</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8461599002</Applyto>
      <Location>New York, New York, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>d0793a44-d91</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects which lead to customers&#39; successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have:</p>
<ul>
<li>Databricks Certification</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461328002</Applyto>
      <Location>Charlotte, North Carolina</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>5fd85b1e-563</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects which lead to customers&#39; successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
<li>Nice to have: Databricks Certification</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, design and deployment of highly performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456965002</Applyto>
      <Location>Dallas, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>faa865dc-a1d</externalid>
      <Title>Senior Data Engineer, BizTech</Title>
      <Description><![CDATA[<p>We&#39;re seeking a hands-on expert to provide technical leadership in addressing BizTech&#39;s diverse data engineering needs and driving long-term strategies and best practices.</p>
<p>As a Senior Data Engineer, you&#39;ll lead the design, implementation, and testing of data systems, from architecture to production. You&#39;ll build batch and real-time data systems that support business needs and critical products, ensuring data systems&#39; quality, performance, and stability through rigorous monitoring and quality assurance practices.</p>
<p>You&#39;ll collaborate with cross-functional teams, including product managers, data scientists, and engineers, to develop scalable systems and drive data-driven decisions. You&#39;ll maintain strong partnerships with backend, data science, and machine learning teams to ensure seamless integration of data systems.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Leading the design, implementation, and testing of data systems, from architecture to production</li>
<li>Building batch and real-time data systems that support business needs and critical products</li>
<li>Ensuring data systems&#39; quality, performance, and stability through rigorous monitoring and quality assurance practices</li>
<li>Collaborating with cross-functional teams to develop scalable systems and drive data-driven decisions</li>
<li>Maintaining strong partnerships with backend, data science, and machine learning teams to ensure seamless integration of data systems</li>
</ul>
<p>We&#39;re looking for someone with 9+ years of relevant experience, a Bachelor&#39;s/Master&#39;s degree in CS/EE, and extensive experience in designing, building, and operating distributed data platforms. You should be proficient in Java, Scala, or Python, with strong skills in data processing and SQL querying. A proven track record of designing and optimizing batch and real-time data pipelines is a must.</p>
<p>In addition to technical expertise, we&#39;re looking for someone with excellent written and verbal communication skills, with the ability to influence stakeholders and convey complex technical concepts. You should be a strong leader and mentor, with experience guiding teams on best practices and technical strategies.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Java, Scala, Python, data processing, SQL querying, distributed data platforms, batch and real-time data pipelines, machine learning, data science, backend development</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Airbnb</Employername>
      <Employerlogo>https://logos.yubhub.co/airbnb.com.png</Employerlogo>
      <Employerdescription>Airbnb is a global online marketplace for short-term vacation rentals, with over 5 million hosts and 2 billion guest arrivals.</Employerdescription>
      <Employerwebsite>https://www.airbnb.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/airbnb/jobs/7640881</Applyto>
      <Location>Bangalore, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>9b657c4e-8a1</externalid>
      <Title>Member of Technical Staff - Data Platform</Title>
      <Description><![CDATA[<p><strong>About the Role</strong></p>
<p>As a software engineer on the Data Platform team, you will design, build, and operate the distributed systems powering X&#39;s data movement and compute. You will take ownership of infrastructure components that process trillions of events daily, driving the scalability, performance, and reliability of the systems that power product and ML workloads across the company.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design and implement high-throughput, low-latency data ingestion and transport systems.</li>
<li>Scale and optimize multi-tenant Kafka infrastructure supporting real-time workloads.</li>
<li>Extend and tune Spark, Flink, and Trino for demanding production pipelines.</li>
<li>Build interfaces, APIs, and pipelines enabling teams to query, process, and move data at petabyte scale.</li>
<li>Debug and optimize distributed systems, with a focus on reliability and performance under load.</li>
<li>Collaborate with ML, product, and infrastructure teams to unblock critical data workflows.</li>
</ul>
<p><strong>Basic Qualifications</strong></p>
<ul>
<li>Proven expertise in distributed systems, stream processing, or large-scale data platforms.</li>
<li>Proficiency in Rust, Go, Scala or similar systems languages.</li>
<li>Hands-on experience with Kafka, Flink, Spark, Trino, or Hadoop in production.</li>
<li>Strong debugging, profiling, and performance optimization skills.</li>
<li>Track record of shipping and maintaining critical infrastructure.</li>
<li>Comfortable working in fast-moving, high-stakes environments with minimal guardrails.</li>
</ul>
<p><strong>Compensation and Benefits</strong></p>
<p>$180,000 - $440,000 USD</p>
<p>Base salary is just one part of our total rewards package at X, which also includes equity, comprehensive medical, vision, and dental coverage, access to a 401(k) retirement plan, short &amp; long-term disability insurance, life insurance, and various other discounts and perks.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,000 - $440,000 USD</Salaryrange>
      <Skills>distributed systems, stream processing, large-scale data platforms, Rust, Go, Scala, Kafka, Flink, Spark, Trino, Hadoop, debugging, profiling, performance optimization</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>xAI</Employername>
      <Employerlogo>https://logos.yubhub.co/x.ai.png</Employerlogo>
      <Employerdescription>xAI creates AI systems to understand the universe and aid humanity in its pursuit of knowledge.</Employerdescription>
      <Employerwebsite>https://www.x.ai/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/xai/jobs/4803862007</Applyto>
      <Location>Palo Alto, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0e9a97ce-6d9</externalid>
      <Title>Enterprise Account Executive, Manufacturing &amp; Automotive</Title>
      <Description><![CDATA[<p>As an Enterprise Account Executive, Manufacturing &amp; Automotive based in Tokyo, you&#39;ll drive the adoption of safe, frontier AI technology across Japan&#39;s manufacturing and automotive industries. You&#39;ll own the full sales cycle from prospecting to closing, working with senior leaders at OEMs, tier-1 suppliers, and industrial manufacturers to help them accelerate R&amp;D, optimize production operations, and transform engineering and knowledge work with Anthropic&#39;s AI solutions.</p>
<p>Responsibilities:</p>
<ul>
<li>Own and exceed revenue targets by winning new accounts and expanding relationships across Japan&#39;s major automotive OEMs, tier-1 and tier-2 suppliers, heavy industry manufacturers, electronics manufacturers, and industrial equipment companies</li>
<li>Build and execute territory plans, identifying high-value opportunities across passenger vehicle and commercial vehicle OEMs, automotive parts suppliers, industrial machinery manufacturers, semiconductor and electronics producers, and materials/chemicals companies</li>
<li>Lead consultative sales processes with senior stakeholders, including chief engineers, R&amp;D directors, heads of digital transformation, manufacturing engineering leaders, connected-vehicle and software-defined vehicle (SDV) leads, and plant operations executives, positioning AI as a driver for engineering productivity, design and simulation acceleration, supply chain intelligence, factory floor optimization, and in-vehicle/product experiences</li>
<li>Orchestrate internal teams (Product, Engineering, Applied AI, Partnerships) to deliver solutions tailored to the rigorous safety, quality, and reliability standards of automotive and manufacturing workflows, while respecting IP protection and data sovereignty requirements</li>
<li>Stay deeply informed about trends across Japan&#39;s manufacturing and automotive sectors and global industry shifts, including electrification, SDV, autonomous driving, Industry 4.0, and smart factory initiatives, ensuring Anthropic is consistently positioned as a relevant and forward-thinking partner to companies navigating rapid technological transformation</li>
</ul>
<p>You may be a good fit if you have:</p>
<ul>
<li>8+ years of enterprise sales experience in Japan with significant exposure to the manufacturing or automotive industry, whether selling directly into OEMs, tier-1 suppliers, industrial manufacturers, or adjacent technology partners (PLM, CAD/CAE, MES, industrial software, cloud/data platforms) within the ecosystem</li>
<li>A genuine understanding of how product development, manufacturing operations, and engineering organizations work in Japan, including the distinct dynamics of keiretsu relationships, long-cycle OEM–supplier partnerships, and the coexistence of traditional monozukuri culture with digital transformation initiatives</li>
<li>A track record of building trusted relationships with technically sophisticated engineering and manufacturing teams, navigating organizations where chief engineers, R&amp;D leaders, and plant/operations executives hold significant influence over technology decisions alongside traditional IT leadership</li>
<li>Demonstrated ability to manage complex, multi-stakeholder sales cycles typical of automotive and manufacturing enterprises, where decisions often involve multiple business units, rigorous POC and evaluation processes, and long-horizon strategic planning</li>
<li>Proven experience exceeding revenue targets by identifying and closing opportunities across a diverse portfolio of manufacturing and automotive companies with varying scales, business models, and digital maturity</li>
<li>A knack for bringing order to chaos and an enthusiastic &#39;roll up your sleeves&#39; mentality; you are a true team player who thrives in ambiguous, startup-like environments</li>
<li>A strategic, analytical approach to identifying opportunities within Japan&#39;s manufacturing and automotive sectors combined with relationship-focused execution that earns trust with engineering, manufacturing, and business leaders alike</li>
<li>A passion for and/or experience with advanced AI systems. You feel strongly about ensuring frontier AI systems are developed safely and responsibly for broad benefit</li>
<li>Excellent communication skills in both Japanese and English</li>
</ul>
<p>What We Offer:</p>
<ul>
<li>Competitive base salary and commission structure commensurate with experience</li>
<li>Equity participation</li>
<li>Comprehensive benefits package</li>
<li>Hybrid work model with flexibility</li>
<li>Access to cutting-edge AI technology and world-class research team</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise sales experience, Manufacturing and automotive industry knowledge, Japanese language skills, English language skills, Advanced AI systems experience, Strategic and analytical thinking, Relationship-building skills, Complex sales cycle management, Keiretsu relationships, Long-cycle OEM-supplier partnerships, Monozukuri culture, Digital transformation initiatives, Cloud/data platforms, Industrial software, Supply chain intelligence, Factory floor optimization, In-vehicle/product experiences</Skills>
      <Category>Sales</Category>
      <Industry>Automotive</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a technology company that develops safe and beneficial AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5104760008</Applyto>
      <Location>Tokyo, Japan</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7f904cf7-7bd</externalid>
      <Title>Data Analyst II</Title>
      <Description><![CDATA[<p>Join us at Brex, the intelligent finance platform that empowers companies to spend smarter and move faster in over 200 markets. As a Data Analyst II, you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>
<p>As a member of our Data organization, you will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses. This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>
<p>Responsibilities:</p>
<ul>
<li>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</li>
<li>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</li>
<li>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</li>
<li>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</li>
<li>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</li>
<li>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</li>
<li>Contribute to the automation of recurring analyses and reporting workflows using Python.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>4+ years of experience in data analytics or a related role in a professional setting.</li>
<li>3+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</li>
<li>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</li>
<li>Proficiency in Python for data analysis, automation, and scripting (Pandas, NumPy, and similar libraries).</li>
<li>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</li>
<li>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</li>
<li>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</li>
<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automate reporting, and build self-service data tools.</li>
</ul>
<p>Bonus points:</p>
<ul>
<li>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</li>
<li>Familiarity with dbt for data modeling and transformation.</li>
<li>Exposure to data pipeline orchestration tools (e.g., Airflow).</li>
<li>Experience in fintech, financial services, or payments.</li>
<li>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, financial services, or payments</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Brex LLC</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is an intelligent finance platform that enables companies to spend smarter and move faster in over 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8463703002</Applyto>
      <Location>São Paulo, São Paulo, Brazil</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7ffabac7-275</externalid>
      <Title>Director, Solutions &amp; Forward Deployed Engineering</Title>
      <Description><![CDATA[<p>We are seeking a Director, Solution &amp; Forward Deployed Engineering to lead the technical delivery of the Zus platform and help customers successfully connect their systems, data, and applications to Zus.</p>
<p>Reporting to the Head of Customer Success &amp; Delivery, this role will own how customers integrate with the Zus platform. They’ll be responsible for ensuring healthcare organisations and digital health builders can reliably ingest data, connect EHR systems, and deploy applications powered by Zus APIs.</p>
<p>This leader will oversee teams responsible for forward deployment engineering and technical enablement, working closely with customer engineering teams to integrate Zus into production environments and connect to data networks.</p>
<p>You will guide customers through the complexity of healthcare interoperability, helping them translate real-world workflows into scalable integrations built on Zus.</p>
<p>This is a hands-on player-coach role. You will lead the team while also personally engaging in complex implementations, architecture discussions, and customer deployments.</p>
<p>You will champion the use of AI tools, automation frameworks, and reusable integration patterns to dramatically improve how quickly and reliably customers connect to the Zus platform.</p>
<p>The ideal candidate combines deep experience in healthcare interoperability, enterprise software implementations, API platforms, and AI-enabled engineering workflows with the leadership skills required to scale a delivery organisation.</p>
<p>Key responsibilities:</p>
<ul>
<li>Lead implementation and technical delivery - Own the technical delivery lifecycle from contract signature through production deployment and early adoption</li>
<li>Lead and grow a team of Solutions Engineers and Forward Deployed Engineers - Develop career paths, performance expectations, and development plans for the team to ensure excellent execution of goals</li>
<li>Ensure consistent, high-quality execution across multiple concurrent enterprise implementations</li>
<li>Establish best practices for onboarding, implementation, integration, and go-live readiness</li>
<li>Set customers up for success across multiple different high priority use cases</li>
<li>Ensure customers achieve rapid time-to-value from the Zus platform</li>
<li>Act as player-coach for complex implementations - Personally engage on strategic or technically complex customer deployments</li>
<li>Guide integrations involving FHIR, HL7, CCD, APIs, SFTP pipelines, and EHR platforms</li>
<li>Troubleshoot complex interoperability and data pipeline issues</li>
<li>Work directly with engineering teams to deploy and operationalize Zus products</li>
<li>Serve as a trusted technical advisor to customer technical and operational stakeholders</li>
<li>Drive forward deployed engineering - Support customers in building production-grade applications and workflows on top of Zus APIs</li>
<li>Help customers operationalize clinical and operational data across care delivery workflows</li>
<li>Lead the development of reference architectures and deployment patterns</li>
<li>Identify integration opportunities that accelerate product adoption and expansion</li>
<li>Deliver training and technical enablement - Oversee technical onboarding and training programs for new customers</li>
<li>Enable customer engineering and product teams to effectively build on the Zus platform</li>
<li>Develop documentation, workshops, and enablement resources for technical users</li>
<li>Drive AI-enabled implementation and automation - Lead the adoption of AI tools and automation frameworks across the delivery organisation</li>
<li>Identify opportunities to automate manual implementation work using LLMs, scripting, and developer tooling</li>
<li>Develop reusable automation patterns for all parts of the Zus ecosystem</li>
<li>Help customers leverage Zus data to power AI-enabled workflows and analytics applications</li>
<li>Partner with Product and Engineering - Translate customer implementation patterns into platform improvements</li>
<li>Participate in technical discussions to find reusable integration patterns that can be embedded directly into the Zus platform</li>
<li>Communicate customer needs to the Product &amp; Engineering teams</li>
</ul>
<p>You&#39;re a good fit because you have:</p>
<ul>
<li>10+ years of experience in technical implementation, solutions engineering, systems integration, or professional services leadership, preferably in healthtech, SaaS, or enterprise software</li>
<li>Proven experience leading customer-facing teams and scaling implementation or professional services functions</li>
<li>Deep expertise in healthcare data interoperability, including FHIR, HL7, CCD, and EHR integrations</li>
<li>Strong understanding of APIs, data ingestion pipelines (ETL, JSON, CSV), and modern data platforms (e.g., Snowflake)</li>
<li>Experience designing scalable implementation frameworks and reusable integration patterns</li>
<li>Familiarity with secure environments and compliance frameworks (HIPAA, SOC 2)</li>
<li>Executive presence and the ability to build trust with both technical and non-technical stakeholders</li>
<li>Strong strategic thinking paired with a willingness to dive into complex technical or delivery challenges when needed</li>
<li>A self-starter mindset and comfort operating in a fast-paced, evolving startup environment</li>
<li>Passion for improving healthcare through better access to and use of data</li>
<li>Willingness to travel up to ~25% for customer engagements, industry events, and company meetings</li>
<li>Bachelor’s degree in Business, Engineering, or a related field (advanced degree a plus)</li>
</ul>
<p>Additional Information:</p>
<p>We will offer you...</p>
<ul>
<li>Competitive compensation that reflects the value you bring to the team: a combination of cash and equity</li>
<li>Robust benefits that include health insurance, wellness benefits, 401k with a match, unlimited PTO</li>
<li>Opportunity to work alongside a passionate team that is determined to help change the world (and have fun doing it)</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$150,000-200,000 per year</Salaryrange>
      <Skills>Healthcare data interoperability, Enterprise software implementations, API platforms, AI-enabled engineering workflows, Leadership skills, FHIR, HL7, CCD, EHR integrations, APIs, Data ingestion pipelines, Modern data platforms, Scalable implementation frameworks, Reusable integration patterns, Secure environments, Compliance frameworks, Executive presence, Strategic thinking, Self-starter mindset, Passion for improving healthcare</Skills>
      <Category>Engineering</Category>
      <Industry>Healthcare</Industry>
      <Employername>Zus</Employername>
      <Employerlogo>https://logos.yubhub.co/zus.com.png</Employerlogo>
      <Employerdescription>Zus is a shared health data platform designed to accelerate healthcare data interoperability by providing easy-to-use patient data via API, embedded components, and direct EHR integrations.</Employerdescription>
      <Employerwebsite>https://zus.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/zushealth/de7b4911-901f-4548-9d68-9b77c0ccf6b6</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>facf5d80-7bd</externalid>
      <Title>Solutions Engineer, Delivery &amp; Automation</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Solutions Engineer who gets energized by solving gnarly technical problems and making customers wildly successful. As the technical quarterback for new customer onboardings, you&#39;ll translate their vision into working integrations, navigate the chaos of healthcare data standards, and ensure they extract real value from day one.</p>
<p>Key responsibilities:</p>
<p>Own the technical journey - Lead end-to-end onboarding for new customers, from authentication setup to data mart configuration</p>
<p>Integrate customer systems with Zus (APIs, SFTP, HL7, FHIR, the whole interoperability stack)</p>
<p>Translate messy business requirements into clean technical architectures</p>
<p>Build and maintain automated workflows that make implementations faster and more reliable</p>
<p>Drive customer success through technical excellence - Be the trusted technical advisor customers call when things get complicated</p>
<p>Run technical deep dives and implementation reviews that actually move the needle</p>
<p>Identify integration risks before they become blockers and solve them proactively</p>
<p>Train customers on best practices so they become power users, not support tickets</p>
<p>Innovate on process - Use AI tools (LLMs, automation platforms, scripting) to eliminate manual work and scale your impact</p>
<p>Build templates, scripts, and tooling that make the 10th implementation faster than the 1st</p>
<p>Document learnings and create repeatable playbooks through automation that make the whole team better</p>
<p>Collaborate with R&amp;D - Partner closely with Product and Engineering to surface integration challenges and opportunities for platform improvement</p>
<p>Translate real-world customer integration patterns into product feedback and roadmap insights</p>
<p>Collaborate with R&amp;D teams on emerging capabilities around AI, data pipelines, and developer tooling</p>
<p>Act as the voice of the customer when identifying opportunities to improve developer experience and reduce integration friction</p>
<p>You&#39;ll enjoy solving messy integration challenges, building automation that eliminates manual work, and partnering closely with Product and Engineering to continuously improve the platform.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$125,000-165,000 per year</Salaryrange>
      <Skills>healthcare data standards (FHIR, HL7, CCD), major EMRs (Epic, Cerner, athenahealth), API and data pipeline experience (ETL, REST APIs, JSON, CSV ingestion), data platforms (Snowflake, SQL databases) including schema design and query optimization, Python scripting skills and SQL fluency, secure environments and compliance (HIPAA, SOC2), AI tools (LLMs, automation platforms, scripting), data pipelines, developer tooling</Skills>
      <Category>Engineering</Category>
      <Industry>Healthcare</Industry>
      <Employername>Zus</Employername>
      <Employerlogo>https://logos.yubhub.co/zus.com.png</Employerlogo>
      <Employerdescription>Zus is a shared health data platform designed to accelerate healthcare data interoperability. It was founded in 2021 by Jonathan Bush, co-founder and former CEO of athenahealth.</Employerdescription>
      <Employerwebsite>https://zus.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/zushealth/fbe45c72-4269-4c7f-b88c-6df3349c2479</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>6acd8036-5ec</externalid>
      <Title>Platform Engineer (Databases &amp; Storage)</Title>
      <Description><![CDATA[<p>We are looking for a Staff Platform Engineer to own the database and storage foundation of World Labs. This is a high-impact systems role at the intersection of databases, distributed systems, and AI infrastructure. You will define how core data systems are designed, scaled, and operated in an environment where workloads are evolving quickly and requirements are often ambiguous.</p>
<p>Your responsibilities will include:</p>
<ul>
<li>Owning the design and evolution of the transactional systems that power the platform</li>
<li>Defining architecture for database and storage systems under high-throughput, low-latency workloads</li>
<li>Making and driving decisions around data modeling, indexing, replication, and consistency</li>
<li>Debugging and resolving complex production issues</li>
<li>Establishing standards for reliability, observability, and operability across the platform</li>
<li>Partnering with product and research teams to support evolving and often ambiguous requirements</li>
<li>Driving improvements in performance, scalability, and cost across the system</li>
<li>Mentoring engineers and raising the bar for system design and technical decision-making</li>
</ul>
<p>Key qualifications:</p>
<ul>
<li>10+ years of experience building and operating production systems at scale, with ownership of critical infrastructure</li>
<li>Strong experience designing and operating transactional systems and databases</li>
<li>Deep understanding of data modeling, indexing, transactions, concurrency, and consistency tradeoffs</li>
<li>Experience owning systems with strict reliability and performance requirements in production</li>
<li>Strong experience debugging complex production issues and reasoning about failure modes</li>
<li>Experience designing distributed systems or large-scale infrastructure where tradeoffs are non-trivial</li>
<li>Proven ability to define architecture and drive technical decisions end-to-end</li>
<li>Strong judgment in balancing performance, reliability, and cost</li>
<li>Ability to operate effectively in ambiguous, fast-moving environments with high ownership</li>
</ul>
<p>Preferred qualifications:</p>
<ul>
<li>Experience with database internals, storage systems, or query engines</li>
<li>Experience building infrastructure for AI/ML systems or data platforms</li>
<li>Experience in early-stage or high-growth environments</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$200k-$300k base salary (good-faith estimate for San Francisco Bay Area upon hire; actual offer based on experience, skills, and qualifications)</Salaryrange>
      <Skills>database internals, storage systems, query engines, data modeling, indexing, transactions, concurrency, consistency, distributed systems, large-scale infrastructure, AI/ML systems, data platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>World Labs</Employername>
      <Employerlogo>https://logos.yubhub.co/worldlabs.ai.png</Employerlogo>
      <Employerdescription>World Labs builds foundational world models that can perceive, generate, reason, and interact with the 3D world.</Employerdescription>
      <Employerwebsite>https://www.worldlabs.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/worldlabs/jobs/4194381009</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>93a43345-780</externalid>
      <Title>FinOps Program Manager</Title>
      <Description><![CDATA[<p>We believe that the way people interact with their finances will drastically improve in the next few years. We&#39;re dedicated to empowering this transformation by building the tools and experiences that thousands of developers use to create their own products. Plaid powers the tools millions of people rely on to live a healthier financial life.</p>
<p>The FinOps function is responsible for financial accountability, visibility, and optimization across all engineering-related spend at Plaid. This includes cloud infrastructure, AI/ML and data workloads, third-party SaaS tools, and other technical investments that support Plaid&#39;s products and internal platforms.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Monitors and analyzes engineering spend across cloud, AI/ML, data platforms, and SaaS, identifying trends, anomalies, and optimization opportunities.</li>
<li>Builds and maintains forecasts for engineering spend, partnering with Finance and engineering leaders to understand drivers, assumptions, and risks.</li>
<li>Partners with engineering, product, and TPMs to incorporate cost considerations into roadmaps, architectural decisions, and execution plans.</li>
<li>Leads cost optimization initiatives, such as rightsizing, commitment strategies, and workload efficiency improvements, in collaboration with engineering owners.</li>
<li>Creates and maintains dashboards and reporting that make spend understandable and actionable for both engineers and executives.</li>
<li>Implements FinOps practices and processes, including showback/chargeback models, unit economics, and cost ownership frameworks.</li>
<li>Partners on tooling and automation, working with data and engineering teams to improve cost visibility, forecasting accuracy, and operational efficiency.</li>
<li>Drives alignment and behavior change, helping teams balance cost, performance, reliability, and velocity through data-driven decision making.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>6–10+ years of relevant experience working at the intersection of engineering, infrastructure, data, or finance in a cloud-native or SaaS environment.</li>
<li>Proven experience partnering closely with engineering teams to influence decisions involving cloud infrastructure, data platforms, AI/ML workloads, or SaaS spend.</li>
<li>Working understanding of modern cloud-native architectures, including core components such as compute, storage, networking, data pipelines, and managed services, enough to engage credibly with engineers on design, tradeoffs, and cost drivers.</li>
<li>Strong foundation in cost analysis, forecasting, budgeting, and variance management, with the ability to translate data into clear, actionable insights.</li>
<li>Comfort working directly with data, including writing SQL (or effectively using AI-assisted tools to do so) to explore datasets, validate assumptions, and answer ad hoc questions.</li>
<li>Experience building clear, high-quality dashboards and BI artifacts that are not only accurate, but intuitive and delightful for engineers and leaders to use.</li>
<li>Demonstrated success driving adoption and behavior change, embedding cost awareness into day-to-day engineering workflows, not just producing reports.</li>
<li>Experience owning and delivering cross-functional programs end-to-end, often without direct authority or a dedicated team.</li>
<li>Familiarity with FinOps principles and practices (e.g., shared ownership, showback/chargeback, unit economics, optimization strategies).</li>
<li>Strong communication skills, with the ability to tailor complex technical and financial concepts for engineering, finance, and executive audiences.</li>
</ul>
<p><strong>Nice to Haves</strong></p>
<ul>
<li>Hands-on familiarity with cloud cost management tools (e.g., AWS Cost Explorer, GCP Billing, Azure Cost Management, CloudHealth, Cloudability, or similar).</li>
<li>Experience working with or supporting data platforms and AI/ML workloads, including understanding cost drivers for batch processing, streaming, storage, and model training/inference.</li>
<li>Exposure to showback/chargeback models, cost allocation strategies, or product-level unit economics.</li>
<li>Experience improving data models or pipelines that support analytics, reporting, or financial attribution.</li>
<li>Familiarity with BI tools such as Mode, Tableau, Looker, or similar, and a strong eye for dashboard usability and design.</li>
<li>Background in a technical role (e.g., engineering, TPM, infra, data, or engineering operations) before moving into a more cross-functional or business-oriented position.</li>
<li>Experience operating in a high-growth or rapidly scaling environment, where cost structures and investment priorities are evolving quickly.</li>
</ul>
<p><strong>Additional Information</strong></p>
<p>Our mission at Plaid is to unlock financial freedom for everyone. To support that mission, we seek to build a diverse team of driven individuals who care deeply about making the financial ecosystem more equitable. We recognize that strong qualifications can come from both prior work experiences and lived experiences. We encourage you to apply to a role even if your experience doesn&#39;t fully match the job description.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$172,800-$259,200 per year</Salaryrange>
      <Skills>cloud infrastructure, AI/ML, data platforms, SaaS, cost analysis, forecasting, budgeting, variance management, SQL, data visualization, dashboard creation, cross-functional program management, FinOps principles, showback/chargeback models, unit economics, optimization strategies, cloud cost management tools, data platforms and AI/ML workloads, cost allocation strategies, product-level unit economics, BI tools, technical role background, high-growth environment experience</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Plaid</Employername>
      <Employerlogo>https://logos.yubhub.co/plaid.com.png</Employerlogo>
      <Employerdescription>Plaid is a financial technology company that provides a platform for developers to connect their financial accounts to various applications and services.</Employerdescription>
      <Employerwebsite>https://plaid.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/plaid/acb399b1-e0f8-45f3-bffa-c89c9c573a12</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>08842cf0-459</externalid>
      <Title>Account Executive - Enterprise</Title>
      <Description><![CDATA[<p>About Mistral AI</p>
<p>At Mistral AI, we believe in the power of AI to simplify tasks, save time, and enhance learning and creativity. Our technology is designed to integrate seamlessly into daily working life.</p>
<p>We are a global company with teams distributed between France, USA, UK, Germany, and Singapore. Our diverse workforce thrives in competitive environments and is committed to driving innovation.</p>
<p>Role Summary</p>
<p>As an Enterprise Account Executive, you will play a pivotal role in driving Mistral&#39;s adoption among our most strategic enterprise customers. We are looking for highly action-oriented individuals who thrive in fast-paced environments, love making deals, and have a strong bias for execution.</p>
<p>Responsibilities</p>
<p>Lead Development (Strategic Outbound &amp; Qualified Inbound)</p>
<ul>
<li>Drive strategic outreach &amp; leverage introductions to engage high-potential enterprise customers</li>
<li>Convert inbound opportunities into high-value partnerships, including upsells and bespoke enterprise agreements</li>
<li>Build and maintain a strong pipeline of qualified opportunities</li>
</ul>
<p>Value Proposition Validation</p>
<ul>
<li>Support enterprise customers through Proof of Concept (POC) phases, ensuring a smooth and impactful evaluation process</li>
<li>Translate successful evaluations into long-term production contracts by demonstrating clear ROI and business impact</li>
<li>Align Mistral&#39;s capabilities with customer strategic priorities</li>
</ul>
<p>Deal Management &amp; Closing</p>
<ul>
<li>Develop and execute strategic sales plans to convert leads into long-term customers</li>
<li>Act as the primary point of contact for external stakeholders throughout the entire sales cycle</li>
<li>Lead negotiations and collaborate with legal, technical, and implementation teams to finalize agreements</li>
</ul>
<p>Executive Engagement</p>
<ul>
<li>Build strong relationships with C-level executives, innovation leaders, and senior decision-makers</li>
<li>Understand customer strategic priorities and position Mistral&#39;s AI capabilities as a critical enabler of their initiatives</li>
<li>Guide executive stakeholders through complex technology adoption decisions</li>
</ul>
<p>Technical Collaboration</p>
<ul>
<li>Develop a strong understanding of Mistral&#39;s AI platform and technical capabilities</li>
<li>Work closely with implementation and engineering teams to address technical questions and ensure successful deployments</li>
</ul>
<p>Training &amp; Enablement</p>
<ul>
<li>Share customer insights internally to inform product development and strategy</li>
<li>Help internal teams better understand enterprise customer needs and market opportunities</li>
</ul>
<p>Who You Are</p>
<p>Consulting or Strategic Background</p>
<ul>
<li>Experience in strategy consulting (e.g., McKinsey, BCG, Bain, or similar firms)</li>
<li>Strong ability to structure complex problems and translate strategy into action</li>
</ul>
<p>Commercial Mindset</p>
<ul>
<li>Highly action-oriented with a strong bias toward execution</li>
<li>Passion for deal-making and turning opportunities into closed agreements</li>
<li>Ability to operate in fast-paced and ambiguous environments</li>
</ul>
<p>Executive Presence</p>
<ul>
<li>Comfortable engaging and influencing C-level executives</li>
<li>Strong communication and storytelling skills</li>
</ul>
<p>Nice to Have</p>
<ul>
<li>Experience working on large transformation projects (AI, infrastructure, telecom, energy, or digital transformation)</li>
<li>Familiarity with AI, data platforms, or enterprise software ecosystems</li>
<li>Ideally, a first role in sales, business development, or another deal-closing function</li>
</ul>
<p>What We Offer</p>
<ul>
<li>Competitive cash salary and equity</li>
<li>Food: Daily lunch vouchers</li>
<li>Sport: Monthly contribution to a Gym pass subscription</li>
<li>Transportation: Monthly contribution to a mobility pass</li>
<li>Health: Full health insurance for you and your family</li>
<li>Parental: Generous parental leave policy</li>
<li>Visa sponsorship</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>strategy consulting, deal-making, execution, complex problem-solving, communication, storytelling, influencing C-level executives, large transformation projects, AI, data platforms, enterprise software ecosystems</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Mistral AI</Employername>
      <Employerlogo>https://logos.yubhub.co/mistral.ai.png</Employerlogo>
      <Employerdescription>Mistral AI develops and provides AI technology for enterprise use, with a comprehensive AI platform designed for on-premises or cloud environments.</Employerdescription>
      <Employerwebsite>https://mistral.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/mistral/2a357282-9d44-4b41-a249-c75ffe878ce2</Applyto>
      <Location>Paris</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>3e6de12f-61a</externalid>
      <Title>Senior Technical Program Manager</Title>
      <Description><![CDATA[<p>As Mercury grows, our revenue platforms increasingly sit at the intersection of product launches, data systems, go-to-market motion, and external vendors. We&#39;re hiring a Technical Program Manager for Revenue Technology to ensure that complex, multi-team initiatives actually move: in the right order, with the right dependencies surfaced, and with real progress visible week over week.</p>
<p>This is not a status-meeting role. This is a role for a technical, hands-on TPM who understands systems well enough to unblock work directly, whether that&#39;s in Linear, Salesforce, data workflows, or coordination across Product, Engineering, Data, and Revenue.</p>
<p>In this role, you will report to the Head of Platforms &amp; Infrastructure and play a critical role in protecting execution focus so specialized technical roles can operate at full effectiveness.</p>
<p>Some things you&#39;ll do on the job:</p>
<ul>
<li>Own delivery sequencing and execution momentum across Revenue Technology initiatives</li>
<li>Translate ambiguous goals into clear scopes, milestones, and dependency maps</li>
<li>Actively manage work in Linear and related tools, ensuring priorities are visible and intentional</li>
<li>Identify when work is blocked, drifting, over-scoped, or colliding, and intervene early</li>
<li>Partner closely with Solution Architecture, Data Strategy, Engineering, and Systems &amp; Workflow Experience</li>
<li>Perform light hands-on technical work (configuration, stopgaps, tooling hygiene) to unblock progress</li>
<li>Absorb coordination load across Product, Engineering, Data Engineering, Data Science, Revenue, and vendors</li>
<li>Drive planning cycles anchored to Cosmic Objectives, not ad-hoc urgency</li>
<li>Make tradeoffs explicit and documented so teams can move without re-litigating decisions</li>
</ul>
<p>You should have:</p>
<ul>
<li>5+ years experience as a Technical Program Manager, Technical PM, or equivalent role in a SaaS or fintech environment</li>
<li>Demonstrated ability to drive delivery across multi-team, cross-functional initiatives (Product, Engineering, Data, Revenue, vendors)</li>
<li>Strong technical fluency, able to reason about systems, data flows, integrations, and platform constraints</li>
<li>Hands-on experience managing work in tools like Linear, Jira, or similar, with a bias toward clarity and momentum</li>
<li>Proven ability to surface dependencies, sequence work, and intervene when execution is blocked or drifting</li>
<li>Excellent written and verbal communication skills, especially in ambiguous environments</li>
<li>Comfort performing light hands-on technical work (tool configuration, workflow setup, stopgap solutions)</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Experience supporting GTM, revenue, or customer lifecycle platforms</li>
<li>Familiarity with Salesforce, data platforms, or revenue tooling ecosystems</li>
<li>Experience operating within Agile/Kanban systems, with judgment about when to adapt them</li>
<li>Background working closely with architecture or platform engineering teams</li>
</ul>
<p>The total rewards package at Mercury includes base salary, equity (stock options), and benefits. Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry. New hire offers are made based on a candidate’s experience, expertise, geographic location, and internal pay equity relative to peers.</p>
<p>Our target new hire base salary ranges for this role are the following:</p>
<ul>
<li>US employees in New York City, Los Angeles, Seattle, or the San Francisco Bay Area: $131,000 - $148,000</li>
<li>US employees outside of New York City, Los Angeles, Seattle, or the San Francisco Bay Area: $118,000 - $133,200</li>
<li>Canadian employees (any location): CAD $124,300 - $139,900</li>
</ul>
<p>Mercury values diversity &amp; belonging and is proud to be an Equal Employment Opportunity employer. All individuals seeking employment at Mercury are considered without regard to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, sexual orientation, or any other legally protected characteristic.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$118,000 - $148,000</Salaryrange>
      <Skills>Technical Program Management, Revenue Technology, Linear, Salesforce, Data Workflows, Solution Architecture, Data Strategy, Engineering, Systems &amp; Workflow Experience, GTM, Revenue, Customer Lifecycle Platforms, Data Platforms, Revenue Tooling Ecosystems, Agile/Kanban Systems</Skills>
      <Category>Finance</Category>
      <Industry>Fintech</Industry>
      <Employername>Mercury</Employername>
      <Employerlogo>https://logos.yubhub.co/mercury.com.png</Employerlogo>
      <Employerdescription>Mercury is a fintech company that provides banking services through Choice Financial Group and Column N.A.</Employerdescription>
      <Employerwebsite>https://www.mercury.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mercury/jobs/5856800004</Applyto>
      <Location>San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>782f7e7f-e0e</externalid>
      <Title>Revenue Technology - Data Strategy &amp; Operations Lead</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Data Strategy &amp; Operations leader to own the data foundations that power revenue execution. This role ensures that revenue data is reliable, interpretable, scalable, and usable as the business evolves and that teams can act on what they see with confidence.</p>
<p>In this role, you will report to the Head of Platforms &amp; Infrastructure and play a central role in shaping how Mercury models, governs, and operationalizes GTM data. You’ll partner closely with Data Engineering, Data Science, Solution Architecture, and Platform Engineering, among other teams.</p>
<p>Some key responsibilities include:</p>
<ul>
<li>Owning the definition, structure, and reliability of data originating from revenue platforms (e.g., Salesforce, GTM tools, automation systems)</li>
<li>Serving as the primary decision owner for GTM-sourced tables and views used for revenue execution, forecasting inputs, lifecycle tracking, and signal-based workflows</li>
<li>Designing and evolving core GTM data models across Salesforce, ETL, and analytics layers</li>
<li>Partnering with Data Engineering to align GTM schemas with enterprise data models and define clear data contracts between source systems and downstream consumers</li>
<li>Partnering with Data Science / Analytics to ensure revenue data is interpretable, statistically sound, and reflects how the business actually operates</li>
<li>Owning clarity around data ownership boundaries, shared dependencies, and escalation paths when upstream or downstream changes impact revenue integrity</li>
<li>Defining and upholding data quality, freshness, consistency, and documentation standards for revenue platforms</li>
<li>Monitoring and improving pipeline reliability, performance, and scalability, proactively identifying fragile or redundant transformations</li>
<li>Identifying opportunities to automate manual or error-prone data workflows and reduce operational overhead</li>
<li>Acting as a data thought partner to Platforms &amp; Infrastructure, Revenue Operations, Analytics, and Security, advising on feasibility, tradeoffs, and sequencing for data-heavy initiatives</li>
</ul>
<p>You should have:</p>
<ul>
<li>7+ years of experience in data engineering or data systems roles within SaaS or technology companies</li>
<li>Deep experience designing and operating production data pipelines</li>
<li>Highly proficient in SQL and experienced in data modeling</li>
<li>Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery, Redshift)</li>
<li>Experience with ETL / ELT tooling (e.g., dbt, Airflow, Census, or similar)</li>
<li>Understanding of Salesforce data models and common GTM system architectures</li>
<li>Ability to translate business concepts into durable, well-structured data models</li>
<li>Clear communication skills with both technical and non-technical partners</li>
</ul>
<p>Preferred qualifications include:</p>
<ul>
<li>Experience supporting revenue, sales, or customer lifecycle data</li>
<li>Familiarity with event-based data platforms (e.g., Data Cloud or equivalents)</li>
<li>Experience working alongside platform engineering and security teams</li>
<li>Exposure to data governance, access controls, and compliance considerations</li>
<li>Experience mentoring or guiding other data practitioners</li>
</ul>
<p>The total rewards package at Mercury includes base salary, equity, and benefits. Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$142,600 - $198,000</Salaryrange>
      <Skills>SQL, data modeling, modern data stacks, ETL/ELT tooling, Salesforce data models, GTM system architectures, event-based data platforms, data governance, access controls, compliance considerations, mentoring/guiding other data practitioners</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Mercury</Employername>
      <Employerlogo>https://logos.yubhub.co/mercury.com.png</Employerlogo>
      <Employerdescription>Mercury is a fintech company that provides banking services through Choice Financial Group and Column N.A.</Employerdescription>
      <Employerwebsite>https://www.mercury.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mercury/jobs/5806201004</Applyto>
      <Location>San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>ed25fe37-186</externalid>
      <Title>Solutions Engineer</Title>
      <Description><![CDATA[<p>As a Solutions Engineer at Hebbia, you will be a revenue-critical leader who owns the technical strategy in complex, high-scrutiny enterprise evaluations. You will operate at the intersection of commercial leadership and AI execution, shaping structured solution strategies while proving real value during the buying process.</p>
<p>You will bring clarity to ambiguity. You will translate complex enterprise problems into structured, Hebbia-driven solution approaches that executives can confidently act on. At the same time, you will design tailored AI-powered workflows and evaluations that validate impact, reduce decision risk, and strengthen the commercial narrative.</p>
<p>You will operate as a trusted advisor to sophisticated buyers navigating ambiguity, risk, and scale. Your job is to clarify thinking, reduce decision risk, and prove value in environments where accuracy, speed, and explainability matter.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Owning technical strategy from discovery through close in complex enterprise sales cycles</li>
<li>Leading rigorous technical discovery to understand workflows, data realities, constraints and executive success criteria</li>
<li>Translating ambiguous customer challenges into clear structured Hebbia solution frameworks</li>
<li>Designing and configuring tailored AI-powered workflows to prove differentiated value during evaluations</li>
<li>Delivering high-stakes live demos and technical discussions with senior stakeholders</li>
<li>Surfacing risks early, challenging assumptions, and shaping mitigation strategies that accelerate deal progression</li>
<li>Partnering closely with Sales to increase win rates, deal velocity, and executive confidence</li>
<li>Providing structured product feedback that sharpens positioning and informs roadmap decisions</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>5+ years in a customer-facing technical role within complex enterprise sales environments (Solutions Engineering, Sales Engineering, Consulting, Implementation, or similar)</li>
<li>Deep experience with AI systems, data platforms, LLM-based workflows or decision intelligence platforms</li>
<li>Demonstrable executive presence, able to earn trust from C-Suite practitioners</li>
<li>Strong at structuring ambiguous problems into clear, defensible solution approaches</li>
<li>Strong preference for experience selling to or performing Analyst roles in Banking, Asset Management, Private Markets, Legal or Enterprise Knowledge teams in complex enterprise environments</li>
<li>Comfortable operating in high-expectation, performance-driven environments</li>
</ul>
<p>Compensation includes a 70:30 split between team and individual performance, with a combined base + bonus compensation range of $150,000 - $200,000 + competitive equity.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$150,000 - $200,000</Salaryrange>
      <Skills>AI systems, data platforms, LLM-based workflows, decision intelligence platforms, Solutions Engineering, Sales Engineering, Consulting, Implementation, Banking, Asset Management, Private Markets, Legal, Enterprise Knowledge</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Hebbia</Employername>
      <Employerlogo>https://logos.yubhub.co/hebbia.com.png</Employerlogo>
      <Employerdescription>Hebbia is an AI platform for investors and bankers that generates alpha and drives upside. Founded in 2020, it powers investment decisions for major asset managers.</Employerdescription>
      <Employerwebsite>https://hebbia.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/hebbia/jobs/4680820005</Applyto>
      <Location>London, UK</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>77bbde4e-977</externalid>
      <Title>Solutions Engineer</Title>
      <Description><![CDATA[<p>As a Solutions Engineer at Hebbia, you will be a revenue-critical leader who owns the technical strategy in complex, high-scrutiny enterprise evaluations. You will operate at the intersection of commercial leadership and AI execution, shaping structured solution strategies while proving real value during the buying process.</p>
<p>You will bring clarity to ambiguity. You will translate complex enterprise problems into structured, Hebbia-driven solution approaches that executives can confidently act on. At the same time, you will design tailored AI-powered workflows and evaluations that validate impact, reduce decision risk, and strengthen the commercial narrative.</p>
<p>You will operate as a trusted advisor to sophisticated buyers navigating ambiguity, risk, and scale. Your job is to clarify thinking, reduce decision risk, and prove value in environments where accuracy, speed, and explainability matter.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Owning technical strategy from discovery through close in complex enterprise sales cycles</li>
<li>Leading rigorous technical discovery to understand workflows, data realities, constraints, and executive success criteria</li>
<li>Translating ambiguous customer challenges into clear, structured Hebbia solution frameworks</li>
<li>Designing and configuring tailored AI-powered workflows to prove differentiated value during evaluations</li>
<li>Delivering high-stakes live demos and technical discussions with senior stakeholders</li>
<li>Surfacing risks early, challenging assumptions, and shaping mitigation strategies that accelerate deal progression</li>
<li>Partnering closely with Sales to increase win rates, deal velocity, and executive confidence</li>
<li>Providing structured product feedback that sharpens positioning and informs roadmap decisions</li>
</ul>
<p>Required skills include:</p>
<ul>
<li>5+ years in a customer-facing technical role within complex enterprise sales environments</li>
<li>Deep experience with AI systems, data platforms, LLM-based workflows, or decision intelligence platforms</li>
<li>Demonstrable executive presence, able to earn trust from C-Suite practitioners</li>
<li>Strong at structuring ambiguous problems into clear, defensible solution approaches</li>
<li>Strong preference for experience selling to or performing Analyst roles in Banking, Asset Management, Private Markets, Legal, or Enterprise Knowledge teams in complex enterprise environments</li>
<li>Comfortable operating in high-expectation, performance-driven environments</li>
</ul>
<p>Benefits include unlimited PTO, medical, dental, vision, 401K, wellness benefits, catered lunch daily, doordash dinner credit, parental leave policy, fertility benefits, and a competitive equity package with unmatched upside potential.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$150,000 - $200,000</Salaryrange>
      <Skills>AI systems, data platforms, LLM-based workflows, decision intelligence platforms, executive presence, structured problem-solving, Selling to Banking, Asset Management, Private Markets, Legal or Enterprise Knowledge teams, Experience with complex enterprise sales environments</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Hebbia</Employername>
      <Employerlogo>https://logos.yubhub.co/hebbia.com.png</Employerlogo>
      <Employerdescription>Hebbia is an AI platform for investors and bankers that generates alpha and drives upside. Founded in 2020, it powers investment decisions for major asset managers.</Employerdescription>
      <Employerwebsite>https://hebbia.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/hebbia/jobs/4556367005</Applyto>
      <Location>New York City</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>aebaacf5-640</externalid>
      <Title>Integrations Engineer</Title>
      <Description><![CDATA[<p>You will own the full lifecycle of integrations that power Hebbia&#39;s AI , from designing connectors to deploying them in production, monitoring their behavior, and debugging failures in real time.</p>
<p>You&#39;ll work across systems like Snowflake, S3, SharePoint, and internal customer infrastructure, building pipelines that need to handle real-world complexity: unreliable APIs, evolving schemas, massive datasets, and edge cases that don’t show up in documentation.</p>
<p>This role is hands-on, high-ownership, and deeply technical. You won’t just write code; you’ll develop the instincts to operate and debug complex distributed systems in production.</p>
<p>You will build connectors and ingestion pipelines that bring enterprise data into Hebbia&#39;s AI platform, from Snowflake warehouses and SharePoint libraries to live pricing feeds, high-velocity news data, and proprietary customer systems.</p>
<p>You will design and operate pipelines that handle scale, failures, and edge cases gracefully.</p>
<p>You will debug issues across APIs, auth systems, and data formats, often under real-time customer pressure.</p>
<p>You will own reliability end-to-end: monitoring, alerting, on-call, and incident response.</p>
<p>You will improve internal tooling and observability to make systems more robust and easier to operate.</p>
<p>You will partner with product and customer teams to scope, prioritize, and ship the integrations that unlock Hebbia&#39;s highest-value use cases.</p>
<p>You will design and ship agents that sit on top of the ingestion layer, making enterprise data accessible and actionable across all of Hebbia&#39;s product surfaces, from document analysis to structured query workflows.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$160,000 to $265,000</Salaryrange>
      <Skills>Python, APIs, OAuth flows, webhook patterns, rate limiting, pagination, cloud infrastructure, AWS, Kafka, PostgreSQL, Redis, ElasticSearch, enterprise data platforms, document processing pipelines, content extraction systems, agentic systems, LLM-enabled products, AI tools</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Hebbia</Employername>
      <Employerlogo>https://logos.yubhub.co/hebbia.com.png</Employerlogo>
      <Employerdescription>Hebbia is an AI platform for investors and bankers that generates alpha and drives upside, founded in 2020 by George Sivulka and backed by Peter Thiel and Andreessen Horowitz.</Employerdescription>
      <Employerwebsite>https://hebbia.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/hebbia/jobs/4675784005</Applyto>
      <Location>New York City; San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>bbaf1090-aa3</externalid>
      <Title>Senior Product Manager, Cyngn Insight &amp; Fleet Management</Title>
      <Description><![CDATA[<p>About Cyngn</p>
<p>Cyngn is a publicly-traded autonomous technology company that deploys self-driving industrial vehicles to factories, warehouses, and other facilities throughout North America. We are looking for a Senior Product Manager to lead the strategy and execution of Cyngn Insight and Fleet Management, the company&#39;s core software platform for operating, monitoring, and scaling autonomous vehicle fleets in production environments.</p>
<p>Responsibilities</p>
<ul>
<li>Own the product vision, strategy, and roadmap for Cyngn Insight and Fleet Management, including fleet operations, monitoring, analytics, safety workflows, and enterprise integrations.</li>
<li>Define and maintain a multi-quarter roadmap that balances customer needs, operational scalability, system reliability, security, and commercial objectives.</li>
<li>Act as the senior product owner for fleet and operations initiatives, driving clear prioritization, trade-off decisions, and execution across concurrent programs.</li>
<li>Partner with Engineering and Program Management to translate platform strategy into actionable requirements, milestones, and delivery plans.</li>
<li>Translate customer workflows, operational data, and deployment feedback into product requirements that improve fleet uptime, efficiency, safety visibility, and ease of use.</li>
<li>Collaborate with Sales, Business Development, Marketing, and Customer Success on go-to-market strategy, customer onboarding, and long-term account success.</li>
<li>Define, track, and communicate success metrics (KPIs) for fleet performance and platform adoption, including utilization, uptime, incident response, and customer value realization.</li>
</ul>
<p>Qualifications</p>
<ul>
<li>BS/BA in a technical field or equivalent practical experience.</li>
<li>5+ years of product management experience, including at least 2 years in a senior or lead PM role.</li>
<li>Demonstrated success delivering complex, multi-tenant software platforms or enterprise products operating at scale.</li>
<li>Strong technical aptitude with experience working closely with engineering teams on distributed systems, data platforms, or cloud-based products.</li>
<li>Excellent communication and leadership skills, with comfort engaging executives, customers, and deeply technical stakeholders.</li>
</ul>
<p>Preferred Qualifications</p>
<ul>
<li>Experience with fleet management systems, enterprise SaaS platforms, or operational analytics products.</li>
<li>Background in software development, data platforms, or systems engineering (e.g., Python, SQL, cloud infrastructure).</li>
<li>Familiarity with autonomous vehicles, robotics operations, industrial automation, or large-scale deployments in logistics, manufacturing, or warehousing.</li>
</ul>
<p>Benefits &amp; Perks</p>
<ul>
<li>Health benefits (Medical, Dental, Vision, HSA and FSA (Health &amp; Dependent Daycare), Employee Assistance Program, 1:1 Health Concierge)</li>
<li>Life, Short-term and long-term disability insurance (Cyngn funds 100% of premiums)</li>
<li>Company 401(k)</li>
<li>Commuter Benefits</li>
<li>Flexible vacation policy</li>
<li>Sabbatical leave opportunity after 5 years with the company</li>
<li>Paid Parental Leave</li>
<li>Daily lunches for in-office employees and fully-stocked kitchen with snacks and beverages</li>
<li>Monthly meal and tech allowances for remote employees</li>
<li>Allowance to purchase new headphones when you join!</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$160,000-180,000 per year</Salaryrange>
      <Skills>product management, software development, data platforms, cloud-based products, fleet management systems, enterprise SaaS platforms, operational analytics products, Python, SQL, cloud infrastructure, autonomous vehicles, robotics operations, industrial automation, large-scale deployments</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Cyngn</Employername>
      <Employerlogo>https://logos.yubhub.co/cyngn.com.png</Employerlogo>
      <Employerdescription>Cyngn is a publicly-traded autonomous technology company that deploys self-driving industrial vehicles to factories, warehouses, and other facilities throughout North America.</Employerdescription>
      <Employerwebsite>https://www.cyngn.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/cyngn/29a77488-5a84-4242-aba3-a5cca4a6681c</Applyto>
      <Location>Mountain View</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>3d849fbc-058</externalid>
      <Title>Member of Product, Data Platform</Title>
      <Description><![CDATA[<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto.</p>
<p>The Data Platform team is the backbone of Anchorage Digital&#39;s information infrastructure. As data becomes the lifeblood of every product, compliance workflow, and client-facing report we produce, this team is responsible for building and operating a unified, scalable, and reliable data platform that serves the entire organization.</p>
<p>As a Data Platform Product Manager, you will own the strategy and execution for centralizing and formalizing the company&#39;s data infrastructure, spanning internal operational data, transaction and blockchain data, customer data, and external data sources.</p>
<p>Your mission is to transform a fragmented data landscape into a single source of truth that powers mission-critical reporting, business insights, and downstream product experiences across every team at Anchorage.</p>
<p>This is a force-multiplier role. Your work will elevate the quality, speed, and reliability of every product and team at the company.</p>
<p>You will define the standards, build the platform, and create the foundation that enables Anchorage to scale with confidence.</p>
<p>If you thrive at the intersection of complex data systems, cross-functional influence, and platform thinking, this is your opportunity to have outsized impact at a category-defining company in digital assets.</p>
<p>Below, we define our Factors of Growth &amp; Impact to help Anchorage Villagers measure their impact and articulate feedback, coaching, and the rich learning that happens while exploring, developing, and mastering capabilities within and beyond the Member of Product, Data Platform role:</p>
<p><strong>Technical Skills:</strong></p>
<ul>
<li>Own the detailed prioritization of the data platform roadmap, balancing foundational infrastructure work, new capabilities, and technical debt.</li>
<li>Demonstrate deep strategic thinking in shaping the platform roadmap, considering the unique data challenges of digital assets, blockchain protocols, and regulated financial services.</li>
<li>Deliver complex, cross-functional projects with multiple dependencies across engineering, analytics, compliance, and operations teams.</li>
<li>Work closely with engineering and data science counterparts to drive product development processes, sprint planning, and architectural decisions.</li>
<li>Ability to understand and reason about system architecture, including data warehousing, ETL/ELT pipelines, streaming vs. batch processing, and modern data stack components, and to communicate clear requirements to engineering.</li>
<li>Drive comprehensive go-to-market strategy for internal platform adoption, including defining success metrics, tracking KPIs around data quality and platform usage, and iterating based on data-driven insights.</li>
</ul>
<p><strong>Complexity and Impact of Work:</strong></p>
<ul>
<li>Lead and influence cross-functional teams while maintaining strong stakeholder relationships across the entire organization , from engineering to finance to compliance.</li>
<li>Exercise independent decision-making and take full ownership of data platform strategy and execution.</li>
<li>Contribute strategic insights that significantly impact company direction, operational efficiency, and product quality.</li>
<li>Demonstrate platform leadership that elevates the performance and effectiveness of every team that depends on data.</li>
</ul>
<p><strong>Organizational Knowledge:</strong></p>
<ul>
<li>Develop deep understanding of Anchorage&#39;s business model, product suite, regulatory environment, and organizational structure.</li>
<li>Build and maintain strong relationships with stakeholders across all departments to ensure the data platform serves the company&#39;s most critical needs.</li>
<li>Navigate and improve organizational data practices to enhance efficiency, compliance, and decision-making.</li>
<li>Drive company objectives through strategic data platform decisions and initiatives.</li>
</ul>
<p><strong>Communication and Influence:</strong></p>
<ul>
<li>Effectively influence and motivate teams across the organization to adopt platform standards and invest in data quality, even when those teams do not report to you.</li>
<li>Enable cross-functional collaboration through clear, consistent communication about platform capabilities, timelines, and data governance expectations.</li>
<li>Act as a thoughtful knowledge partner to senior leadership, translating complex data infrastructure topics into clear business impact.</li>
<li>Proactively communicate platform goals, status updates, and data health metrics throughout the organization.</li>
</ul>
<p><strong>You may be a fit for this role if you:</strong></p>
<ul>
<li>5+ years of product management experience, with significant time spent on data platforms, data infrastructure, or data-intensive enterprise products.</li>
<li>Proven experience building or scaling enterprise data platforms, including data warehousing, data lakes, ETL/ELT pipelines, or modern data stack tooling (e.g., Snowflake, Databricks, dbt, Airflow, Spark).</li>
<li>Strong understanding of data modeling, data governance, and data quality frameworks.</li>
<li>Experience working with diverse data types, including transactional data, customer data, financial data, and ideally blockchain or on-chain data.</li>
<li>Track record of driving cross-functional alignment and adoption for internal platform products where you must influence without direct authority.</li>
<li>Exceptional written and verbal communication skills, with the ability to convey complex data architecture concepts to both technical and non-technical audiences.</li>
<li>Your empathy and adaptability not only complement others&#39; working styles but also embody our culture of curiosity, creativity, and shared understanding.</li>
<li>You self-describe as some combination of the following: creative, humble, ambitious, detail-oriented, hard-working, trustworthy, eager to learn, methodical, action-oriented, and tenacious.</li>
</ul>
<p><strong>Although not a requirement, bonus points if you have:</strong></p>
<ul>
<li>Hands-on experience with blockchain data indexing, onchain analytics, or crypto-native data infrastructure.</li>
<li>Experience building data platforms that serve both internal analytics consumers and external client-facing products (reports, statements, dashboards).</li>
<li>Experience supporting clients with data-related issues or concerns.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data platforms, data infrastructure, data-intensive enterprise products, data warehousing, data lakes, ETL/ELT pipelines, modern data stack tooling, Snowflake, Databricks, dbt, Airflow, Spark, data modeling, data governance, data quality frameworks, blockchain or on-chain data</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anchorage Digital</Employername>
      <Employerlogo>https://logos.yubhub.co/anchorage.com.png</Employerlogo>
      <Employerdescription>Anchorage Digital is a crypto platform that enables institutions to participate in digital assets through custody, staking, trading, governance, settlement, and the industry&apos;s leading security infrastructure.</Employerdescription>
      <Employerwebsite>https://anchorage.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/anchorage/0e730f61-a2e4-4152-8277-3f6383cc69a6</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>30da0df8-cc9</externalid>
      <Title>F&amp;S COE Analyst</Title>
      <Description><![CDATA[<p>We are seeking an analytically-driven Data Analyst to join our Finance &amp; Strategy team at Stripe. This role bridges the gap between data science and financial planning, requiring someone who can transform complex business data into actionable financial insights.</p>
<p>You will build sophisticated dashboards, develop predictive models, and serve as the technical backbone for our FP&amp;A and GTM analytics initiatives. This is a unique opportunity for a data professional with financial acumen to directly influence strategic business decisions in a high-growth fintech environment.</p>
<p><strong>Financial Data Analytics &amp; Modeling</strong></p>
<ul>
<li>Design, build, and maintain financial dashboards for FP&amp;A, Revenue Operations, and GTM teams using Tableau, Power BI, or Looker</li>
<li>Develop automated financial reporting solutions that reduce manual effort and improve data accuracy</li>
<li>Create sophisticated data models to support budgeting, forecasting, variance analysis, and scenario planning</li>
<li>Build predictive models for revenue forecasting, customer lifetime value, churn analysis, and unit economics</li>
</ul>
<p><strong>Business Intelligence &amp; Reporting</strong></p>
<ul>
<li>Partner with Finance Business Partners and FP&amp;A teams to translate business requirements into technical solutions</li>
<li>Design and implement data infrastructure for financial planning cycles (monthly/quarterly reviews, annual budgets, long-range planning)</li>
<li>Develop self-service analytics capabilities enabling finance teams to access real-time business insights</li>
<li>Create executive dashboards tracking key financial and operational metrics (ARR, bookings, retention, CAC, LTV)</li>
</ul>
<p><strong>Data Engineering &amp; Analytics Infrastructure</strong></p>
<ul>
<li>Write complex SQL queries to extract, transform, and analyze large datasets from multiple source systems</li>
<li>Build ETL pipelines to integrate financial data from ERP, CRM, billing, and data warehouse systems</li>
<li>Ensure data quality, consistency, and governance across financial reporting systems</li>
<li>Optimize database performance and data architecture for scalability</li>
</ul>
<p><strong>Strategic Analysis &amp; Insights</strong></p>
<ul>
<li>Conduct deep-dive analyses on business performance, identifying trends, anomalies, and opportunities</li>
<li>Support strategic initiatives through ad-hoc financial modeling and what-if scenario analysis</li>
<li>Translate complex data findings into clear, actionable recommendations for leadership</li>
<li>Collaborate with Data Science teams to develop advanced analytics and ML models for finance use cases</li>
</ul>
<p><strong>Required Qualifications</strong></p>
<ul>
<li>Advanced SQL proficiency (complex joins, window functions, CTEs, query optimization)</li>
<li>Expert-level experience with at least one BI tool (Tableau, Power BI, Looker, or Qlik)</li>
<li>Advanced Excel/Google Sheets skills (pivot tables, complex formulas, data modeling)</li>
<li>Python or R for data analysis, automation, and statistical modeling</li>
<li>Cloud data platforms (Snowflake, BigQuery, Redshift, Databricks)</li>
<li>ETL tools (dbt, Airflow, Fivetran) and version control (Git)</li>
</ul>
<p><strong>Financial &amp; Business Acumen</strong></p>
<ul>
<li>Experience in data analytics within finance, FP&amp;A, or revenue operations functions</li>
<li>Strong understanding of financial statements (P&amp;L, balance sheet, cash flow)</li>
<li>Knowledge of key financial metrics: ARR, MRR, bookings, revenue recognition, CAC, LTV, gross margin, EBITDA</li>
<li>Experience with financial planning processes: budgeting, forecasting, variance analysis, scenario modeling</li>
<li>Understanding of SaaS/subscription business models and revenue recognition principles (ASC 606 preferred)</li>
</ul>
<p><strong>Analytical &amp; Problem-Solving</strong></p>
<ul>
<li>Proven ability to work with large, complex datasets and derive meaningful insights</li>
<li>Experience building financial models and dashboards that drive executive decision-making</li>
<li>Strong statistical analysis skills and understanding of data visualization best practices</li>
<li>Track record of translating ambiguous business problems into structured analytical frameworks</li>
</ul>
<p><strong>Preferred Experience</strong></p>
<ul>
<li>Background in fintech, payments, B2B SaaS, or high-growth technology companies</li>
<li>Experience supporting GTM analytics (sales forecasting, pipeline analysis, quota setting)</li>
<li>Familiarity with finance systems: NetSuite, Anaplan, Adaptive Planning, Salesforce, Stripe Billing</li>
<li>Exposure to data science methodologies and machine learning concepts</li>
<li>Previous work in cross-functional environments collaborating with finance, data science, and business teams</li>
</ul>
<p><strong>Key Competencies</strong></p>
<ul>
<li>Business Acumen: Ability to understand complex business models and translate them into data requirements</li>
<li>Technical Excellence: Deep technical skills with commitment to code quality and best practices</li>
<li>Communication: Exceptional ability to explain technical concepts to non-technical stakeholders</li>
<li>Stakeholder Management: Experience partnering with senior leaders and influencing through data</li>
<li>Ownership Mindset: Self-directed with ability to manage multiple priorities and drive projects to completion</li>
<li>Continuous Learning: Curiosity to learn new tools, techniques, and business domains</li>
<li>Attention to Detail: Commitment to data accuracy and quality in high-stakes financial reporting</li>
</ul>
<p><strong>Education</strong></p>
<ul>
<li>Bachelor&#39;s degree in Finance, Economics, Statistics, Mathematics, Computer Science, Engineering, or related quantitative field</li>
<li>Advanced degree (MBA, MS in Analytics/Data Science) or relevant certifications (CFA, CPA, data analytics certifications) a plus</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Tableau, Power BI, Looker, Python, R, Cloud data platforms, ETL tools, Version control, Machine learning, Data science, Finance systems, Data visualization</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>Stripe</Employername>
      <Employerlogo>https://logos.yubhub.co/stripe.com.png</Employerlogo>
      <Employerdescription>Stripe is a financial infrastructure platform for businesses. It provides payment processing services to millions of companies worldwide.</Employerdescription>
      <Employerwebsite>https://stripe.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/stripe/jobs/7597624</Applyto>
      <Location>Bengaluru</Location>
      <Country></Country>
      <Postedate>2026-03-31</Postedate>
    </job>
    <job>
      <externalid>76107476-db4</externalid>
      <Title>Senior Data Scientist, Gemini App, Google DeepMind</Title>
      <Description><![CDATA[<p>We are seeking a Senior Data Scientist to join our Gemini App team in Zurich, Switzerland. As a key partner and co-creator in our product strategy, you will be instrumental in building a uniquely proactive and powerful assistant by ensuring our strategic decisions are grounded in data.</p>
<p>Responsibilities:</p>
<ul>
<li>Partner with Verticals PM, engineering, and UX to develop data-driven product strategies</li>
<li>Translate ambiguous questions into well-defined problems, design experiments, and analyse large complex datasets for insights</li>
<li>Develop and implement novel, goal-oriented metrics</li>
<li>Build and deploy statistical/ML models to understand our users, enhance product capabilities and personalise user experience</li>
<li>Communicate findings &amp; recommendations to stakeholders, including executives</li>
<li>Champion data-driven culture by feeding user engagement insights back into models</li>
<li>Collaborate with the GenAI team on model quality and feature adoption</li>
<li>Act as a technical leader for a global team, guiding junior members on complex analyses and upholding best practices to ensure high-quality, impactful work</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Bachelor&#39;s degree in Statistics, Mathematics, Data Science, Engineering, Physics, Economics, or a related quantitative field</li>
<li>5 years of experience with analysis applications (e.g., extracting insights, performing statistical analysis, or solving business problems), and coding (e.g., Python, R, SQL) or 2 years of experience with a Master&#39;s degree</li>
<li>2 years of work experience identifying opportunities for business/product improvement and then defining/measuring the success of those initiatives</li>
<li>A bias for action and a relentless drive to build something great</li>
<li>Strong business acumen and a strategic mindset</li>
<li>Deep technical expertise</li>
<li>Exceptional communication and presentation skills</li>
<li>A collaborative and influential partner</li>
<li>A commitment to user trust and privacy</li>
</ul>
<p>Preferred qualifications include a Master&#39;s degree in a relevant field and experience working with machine learning and statistical modelling tools.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, R, SQL, Machine Learning, Statistical Modelling, Data Analysis, Data Science, Business Acumen, Strategic Thinking, Experience with deep learning frameworks, Knowledge of cloud-based data platforms, Familiarity with agile development methodologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Google DeepMind</Employername>
      <Employerlogo>https://logos.yubhub.co/deepmind.com.png</Employerlogo>
      <Employerdescription>Google DeepMind is a leading artificial intelligence research organisation that develops and applies advanced AI technologies for widespread public benefit and scientific discovery.</Employerdescription>
      <Employerwebsite>https://deepmind.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/deepmind/jobs/7560458</Applyto>
      <Location>Zurich, Switzerland</Location>
      <Country></Country>
      <Postedate>2026-03-16</Postedate>
    </job>
    <job>
      <externalid>e6811a48-f96</externalid>
      <Title>VP, Partner Engineering</Title>
      <Description><![CDATA[<p><strong>VP, Partner Engineering at Quantexa</strong></p>
<p><strong>What we&#39;re all about.</strong></p>
<p>Our DATA values—Determination, Ambition, Teamwork, Accountability—aren&#39;t just operating principles, they&#39;re our competitive advantage. We bring context to every decision with a category-defining Decision Intelligence platform, and our partner ecosystem is the multiplier.</p>
<p>As VP, Partner Engineering, you will lead the technical vision, execution, and team required to scale partner-built and partner-delivered solutions across the world&#39;s most strategic technology and consulting landscapes.</p>
<p><strong>Make Every Decision Count with Context</strong></p>
<p><strong>The opportunity.</strong></p>
<p>Reporting to the SVP Global Alliances, the VP, Partner Engineering is the executive accountable for setting the Partner Engineering strategy to transform our partner ecosystem into a best-in-class technical delivery engine that expands consumption, accelerates CoSell and Channel revenue, and drives production deployments with repeatability and governance discipline.</p>
<p>This role will include:</p>
<ul>
<li>Developing, executing and leading the Partner Engineering strategy to deliver non-linear revenue growth via the Ecosystem</li>
<li>Technical leadership of partner architecture</li>
<li>Management of Partner Solution Architects as a player-coach, i.e. leading by example</li>
<li>Execution of joint solution engineering with hyperscalers, industry ISVs &amp; GSIs</li>
<li>Executive engagement and business impact through ecosystem consumption &amp; marketplace ARR</li>
</ul>
<p><strong>What you&#39;ll be doing.</strong></p>
<ul>
<li>Set the global Partner Engineering strategy and architecture standards aligned to cloud marketplaces, consumption incentives, and AI/data collaborations.</li>
<li>Lead co-innovation blueprints and quick starts integrating our platform with the likes of Microsoft Azure, GCP, AWS, Databricks, Guidewire-style decision systems, and GSI solution frameworks.</li>
<li>Personally sponsor and govern the reference architecture, connector strategy, AI/data interop strategy, and workload optimization patterns for partner solutions.</li>
<li>Create enterprise-grade integration frameworks and enforce architectural guardrails that accelerate deployment while minimizing partner/customer delivery risk.</li>
<li>Partner with Product and R&amp;D for cross-functional collaboration and alignment around roadmap and innovation</li>
<li>Lead, mentor, and scale a global team of Partner Solution Architects focused on:</li>
</ul>
<ul>
<li>Joint solution architecture</li>
<li>Partner-led PoCs/PoVs</li>
<li>Demo and sandbox platforms</li>
<li>Deployment acceleration</li>
<li>Technical certifications and readiness</li>
</ul>
<ul>
<li>Build an elite Partner Architecture Centre of Excellence that rivals top ecosystem engineering orgs in the industry.</li>
<li>Define clear role expectations, capability uplift plans, and partner field alignment models for PSAs.</li>
<li>Establish a performance-driven culture anchored in partner adoption velocity, PoV win rate, demo adoption, deployment acceleration, and marketplace ARR.</li>
<li>Partner with Field Alliances, Field Engineering and RVPs to enable and ensure Ecosystem attribution</li>
<li>Own hyperscaler and ISV technical partnership co-innovation roadmaps, ensuring:</li>
</ul>
<ul>
<li>Marketplace-ready joint solutions</li>
<li>Consumption and integration patterns tied to incentives</li>
<li>Field delivery kits for PSA + partner sellers</li>
<li>Hardened reference designs for repeatable pursuits</li>
</ul>
<ul>
<li>Drive GSI partnerships into factory-style delivery motions, launching industry solutions and field-ready plays with global systems integrators.</li>
<li>Govern multi-party architecture risk, security, deployment patterns, rapid time-to-value, and executive sponsor checkpoints.</li>
<li>Be the evangelist for Ecosystem opportunities and own the hypothesis qualification for embedded or OEM offerings, co-developed offerings, and new pursuits</li>
<li>Working closely with Product, define our ecosystem’s technical differentiation narrative—anchored in decision context, AI/data intelligence, workflows, and enterprise integration depth—against point solutions and platform incumbents.</li>
<li>Ensure PSAs and partners can articulate why our platform wins architecturally, commercially, and operationally.</li>
</ul>
<p><strong>Requirements</strong></p>
<p><strong>What you&#39;ll bring.</strong></p>
<ul>
<li>Proven executive leadership building partner engineering teams within enterprise software ecosystems</li>
<li>Deep technical fluency in data platforms, AI/ML, analytics, cloud integration patterns, connectors, workflow systems and marketplace consumption architectures.</li>
<li>Experience launching production-hardened joint solutions with hyperscalers and GSIs with measurable revenue and deployment outcomes.</li>
<li>Ability to translate platform capability into partner-owned architectures that scale across customer estates.</li>
<li>Practitioner’s mindset for governance of PoVs, partner certifications, deployment velocity, risk mitigation, and technical adoption KPIs</li>
<li>Extensive experience in senior partner engineering, sales engineering or ecosystem technical leadership within large-scale enterprise software or SaaS companies.</li>
<li>Proven experience managing architect/engineering-focused partner teams, PSAs, and global technical enablement functions.</li>
<li>Demonstrated success driving multi-cloud partner consumption ARR, joint solution launches, and co-sell acceleration.</li>
<li>Executive communication credibility with partner CTO/CIO/VP engineering stakeholders and hyperscaler/GSI technical leaders.</li>
<li>Comfort operating across legal, sales, product, security, presales, and partner delivery functions.</li>
<li>Willingness to travel globally (&gt;40%).</li>
</ul>
<p><strong>Benefits</strong></p>
<p><strong>Our perks and quirks.</strong></p>
<p>What makes you Q will help you to realize your full potential, flourish and enjoy what you do, while being recognized and rewarded with our broad range of benefits.</p>
<p>We know that just having an excellent Glassdoor rating isn’t enough, so we’ve put together a competitive package as a way of saying thank you for all your hard work and dedication.</p>
<p>We offer:</p>
<ul>
<li>Competitive salary</li>
<li>Company bonus</li>
<li>Flexible working hours in a hybrid workplace &amp; free access to global WeWork locations &amp; events</li>
<li>Pension Scheme with a company contribution of 6% (if you contribute 3%)</li>
<li>25 days annual leave (with the option to buy up to 5 days) + birthday off!</li>
<li>Work from Anywhere Scheme: Spend up to 2 months working outside of your country of employment over a rolling 12-month period</li>
<li>Family: Enhanced Maternity, Paternity, Adoption, or Shared Parental Leave</li>
<li>Health &amp; Wellbeing: Private Healthcare, EAP, Well-being Days, Calm App, Gym Discounts</li>
<li>Team&#39;s Social Budget &amp; Company-wide Summer &amp; Winter Parties</li>
<li>Tech &amp; Cycle-to-Work Schemes</li>
<li>Volunteer Day off</li>
<li>Dog-friendly Offices</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data platforms, AI/ML, analytics, cloud integration patterns, connectors, workflow systems, marketplace consumption architectures, partner engineering, partner architecture, joint solution engineering, hyperscalers, industry ISV’s, GSIs, executive engagement, ecosystem consumption, marketplace ARR, global team management, partner solution architects, joint solution architecture, partner-led PoCs/PoVs, demo and sandbox platforms, deployment acceleration, technical certifications, performance-driven culture, partner adoption velocity, PoV win rate, demo adoption, deployment acceleration, marketplace ARR, GSI partnerships, factory-style delivery motions, industry solutions, field-ready plays, global systems integrators, multi-party architecture risk, security, deployment patterns, rapid time-to-value, executive sponsor checkpoints, ecosystem opportunities, embedded or OEM, co-developed offerings, new pursuits, technical differentiation narrative, decision context, AI/data intelligence, workflows, enterprise integration depth, point solutions, platform incumbents, practitioner’s mindset, governance of PoVs, partner certifications, deployment velocity, risk mitigation, technical adoption KPIs, senior partner engineering, sales engineering, ecosystem technical leadership, large-scale enterprise software, SaaS companies, architect/engineering-focused partner teams, PSAs, global technical enablement functions, multi-cloud partner consumption ARR, joint solution launches, co-sell acceleration, executive communication credibility, partner CTO/CIO/VP engineering stakeholders, hyperscaler/GSI technical leaders, comfort operating across legal, sales, product, security, presales, partner delivery functions, willingness to travel globally</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Quantexa</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Quantexa is a software company that offers a Decision Intelligence platform. It has a global presence.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/899Vt1BX63R4TGffvrdv1a/hybrid-vp%2C-partner-engineering-in-london-at-quantexa</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7af16166-8fd</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>
<li>Works independently, receiving minimal guidance and direction, to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>
<li>Reviews high level design to ensure alignment to Solution Architecture.</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>
<li>Mentors developers and creates reference implementations/frameworks.</li>
<li>Partners with System Architects to elaborate capabilities and features.</li>
<li>Delivers single domain architecture solutions and executes continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Over 6 years of experience as a senior domain architect for Data domains</li>
<li>Advanced English Level</li>
<li>Master&#39;s degree (PLUS)</li>
<li>Insurance Experience (PLUS)</li>
<li>Financial Services Experience (PLUS)</li>
</ul>
<p><strong>Technical &amp; Business Skills:</strong></p>
<ul>
<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>
<li>Data Architecture / Data Modeling – Advanced (MUST)</li>
<li>Data Warehouse – Advanced (MUST)</li>
<li>Cloud Data Platforms - Advanced</li>
<li>Data Integration Tools – Advanced</li>
<li>Snowflake or Databricks - Intermediate (4-6 Years) MUST</li>
<li>Any Cloud - Intermediate (4-6 Years)</li>
<li>Power BI or Tableau - Intermediate (4-6 Years)</li>
<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>
<li>Data Lakehouse – Intermediate (MUST)</li>
<li>Data Governance - Intermediate</li>
<li>AI/ML - Entry Level (PLUS)</li>
<li>Master Data Management - Intermediate</li>
<li>Operational Data Management - Intermediate</li>
</ul>
<p><strong>Benefits:</strong></p>
<p>This position comes with a competitive compensation and benefits package.</p>
<ul>
<li>A competitive salary and performance-based bonuses.</li>
<li>Comprehensive benefits package.</li>
<li>Flexible work arrangements (remote and/or office-based).</li>
<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>
<li>Private Health Insurance.</li>
<li>Paid Time Off.</li>
<li>Training &amp; Development opportunities in partnership with renowned companies.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, AI/ML, Master Data Management, Operational Data Management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global consulting and technology services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/jdUFHSPZZjHsgd3TR4R3BS/remote-fbs-senior-data-domain-architect-in-colombia-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>592c119a-bbe</externalid>
      <Title>FSI Sales Director - Japan</Title>
      <Description><![CDATA[<p><strong>FSI Sales Director - Japan</strong></p>
<p><strong>What We&#39;re Looking For</strong></p>
<p>We&#39;re looking for a highly motivated sales professional to drive growth in financial institutions across Japan. As an FSI Sales Director, you will be responsible for executing the sales go-to-market strategy, identifying new opportunities, building our pipeline, winning new deals, and meeting company targets.</p>
<p><strong>What You&#39;ll Be Doing</strong></p>
<ul>
<li>Define and execute the Japan Financial Services go-to-market strategy in partnership with the Head of Sales Asia &amp; Japan, SME teams, and APAC Alliances Director.</li>
<li>New logo acquisition across Japan&#39;s mega banks and insurance institutions in a high-impact hunting role.</li>
<li>Consistently achieve revenue targets while expanding footprint within strategic accounts.</li>
<li>Build and progress a high-quality pipeline through disciplined prospecting and structured account planning.</li>
<li>Engage C-suite, business, data, and technology leaders, positioning Quantexa as a strategic enterprise partner.</li>
<li>Orchestrate complex, multi-stakeholder enterprise sales cycles, driving opportunities to close.</li>
<li>Collaborate cross-functionally with senior leadership and Solution Engineering to shape compelling technical and commercial outcomes.</li>
<li>Execute aligned account strategies that convert executive engagement into tangible commercial results.</li>
<li>Tokyo-based role focused on Japan&#39;s Financial Services sector.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>A deep understanding of the Japan financial services landscape</li>
<li>Experience selling complex, multi-stakeholder, multi-month enterprise solutions</li>
<li>Expertise working and solution selling within Japan mega banks and Insurance</li>
<li>Ambitious and energetic with strong interpersonal skills</li>
<li>Good team player, capable of delivering results in less than perfect circumstances</li>
<li>Emotionally intelligent and good reader of people</li>
<li>Fluent Japanese and strong business-level English required as a minimum</li>
<li>Experience working across multinational, cross-functional teams such as pre-sales, product, marketing, customer success, etc.</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive base salary</li>
<li>Company bonus</li>
<li>Free Calm App Subscription</li>
<li>Annual leave, national holidays and your birthday off!</li>
<li>On-going personal development</li>
<li>Country specific benefits</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Sales, Enterprise Sales, Complex Sales, Financial Services, Japan Financial Services, Mega Banks, Insurance Institutions, Advanced Data Analytics, Big Data, AI/ML, Financial crime/KYC and Fraud solutions, Data Management, Data Platforms</Skills>
      <Category>Sales</Category>
      <Industry>Finance</Industry>
      <Employername>Quantexa</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Quantexa is a software company that helps businesses grow by making data easier to manage. It has a diverse workforce with over 50 nationalities and speaks over 20 languages.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/nYUmfvxNtu6vKseaT3HG16/hybrid-fsi-sales-director---japan-in-tokyo-at-quantexa</Applyto>
      <Location>Tokyo</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7b03b30a-b20</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>
<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>
<li>Works independently, receiving minimal guidance and direction, to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>
<li>Reviews high level design to ensure alignment to Solution Architecture.</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>
<li>Mentors developers and creates reference implementations/frameworks.</li>
<li>Partners with System Architects to elaborate capabilities and features.</li>
<li>Delivers single domain architecture solutions and executes continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, Master Data Management, Operational Data Management, AI/ML</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/1U952YA2QBa8zK7Tm5d3Lm/remote-fbs-senior-data-domain-architect-in-mexico-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>8b0e9386-fa9</externalid>
      <Title>Data Engineering &amp; Data Science Consultant</Title>
      <Description><![CDATA[<p><strong>Data Engineering &amp; Data Science Consultant</strong></p>
<p>You will work hands-on on the design, build, and operationalisation of modern data and analytics solutions. You will contribute across the full lifecycle – from data ingestion and transformation to analytics, machine learning, and production deployment. You will collaborate closely with data engineers, architects, data scientists, and business stakeholders to deliver scalable, reliable, and value-driven data solutions in complex client environments.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Apply data science and machine learning techniques to real-world business problems</li>
<li>Work with structured and semi-structured data in data lakes, lakehouses, and data warehouses</li>
<li>Develop and optimise data transformations for analytical and machine learning workloads</li>
<li>Support the productionisation of data and ML solutions, including monitoring and optimisation</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>3–5 years of experience in data engineering, data science, or analytics</li>
<li>Hands-on experience delivering data and analytics solutions in project-based or client environments</li>
<li>Strong problem-solving skills and a pragmatic, delivery-oriented mindset</li>
</ul>
<p><strong>Data Engineering Foundations</strong></p>
<ul>
<li>Experience building end-to-end data pipelines (ingestion, transformation, storage)</li>
<li>Solid understanding of data modelling, data transformations, and feature engineering</li>
<li>Familiarity with cloud-based data platforms, such as Azure, AWS, or GCP</li>
</ul>
<p><strong>Applied Data Science &amp; Analytics</strong></p>
<ul>
<li>Experience applying statistical analysis and machine learning techniques</li>
<li>Strong programming skills in Python</li>
<li>Very good SQL skills and experience working with relational databases</li>
</ul>
<p><strong>Nice to have</strong></p>
<ul>
<li>Experience with streaming technologies (e.g. Kafka, Azure Event Hubs)</li>
<li>Exposure to GenAI, NLP, time series, or advanced analytics use cases</li>
<li>Experience with NoSQL databases (e.g. MongoDB, Cosmos DB)</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data science, machine learning, data engineering, cloud-based data platforms, data modelling, data transformations, feature engineering, Python, SQL, relational databases, streaming technologies, GenAI, NLP, time series, advanced analytics, NoSQL databases</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. The company is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/43f8dm12rcrpZUsa228TbZ/data-engineering-%26-data-science-consultant-in-london-at-infosys-consulting---europe</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>aafa7b92-fa6</externalid>
      <Title>Senior Consultant - Data Engineering &amp; Data Science (m/w/d)</Title>
      <Description><![CDATA[<p>Are you looking to advance your career and work with experienced, talented colleagues to successfully solve the most important challenges of our clients? We are growing further and looking for enthusiastic individuals to strengthen our team. You will be part of a dynamic, strongly growing company with over 300,000 employees.</p>
<p>Our dynamic organisation allows you to work across topics and bring in your ideas, experiences, creativity, and goal orientation. Are you ready?</p>
<p>As a Consultant/Senior Consultant in the Data Engineering &amp; Data Science field, you will work hands-on on the conception, development, and implementation of modern data and analytics solutions. You will support the entire project lifecycle - from data intake and transformation to analytics and machine learning to productive operation.</p>
<p>You will work closely with data engineers, architects, data scientists, and subject matter experts to implement scalable, reliable, and value-adding solutions in complex customer environments.</p>
<p><strong>Your Tasks</strong></p>
<ul>
<li>Apply data science methods (machine learning, deep learning, GenAI) to solve concrete business questions</li>
<li>Work with structured and semi-structured data in data lakes, lakehouses, and data warehouses</li>
<li>Set up data pipelines for analytical workloads</li>
<li>Support the productive implementation of data and ML solutions, including monitoring and optimisation</li>
</ul>
<p><strong>What You Bring - Required</strong></p>
<ul>
<li>At least 3 years of relevant professional experience in the field of data engineering, data science, or analytics</li>
<li>Hands-on experience in implementing data and analytics solutions in (customer) projects</li>
<li>Strong problem-solving skills and a pragmatic, implementation-oriented way of working</li>
</ul>
<p><strong>Data Engineering Fundamentals</strong></p>
<ul>
<li>Experience in setting up data pipelines (ingestion, transformation, storage)</li>
<li>Solid understanding of data modeling, data transformations, and feature engineering</li>
<li>Experience with cloud-based data platforms, such as:</li>
</ul>
<ol>
<li>Azure, AWS, or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse/Microsoft Fabric</li>
</ol>
<ul>
<li>Knowledge of CI/CD concepts and production-ready deployments</li>
</ul>
<p><strong>Applied Data Science &amp; Analytics</strong></p>
<ul>
<li>Experience in applying GenAI, deep learning, and machine learning procedures as well as statistical analyses</li>
<li>Very good programming skills in Python</li>
<li>Very good SQL skills and experience with relational databases</li>
<li>Experience in deploying and productively using ML models</li>
<li>Ability to translate analytical results into business-relevant insights</li>
<li>Bachelor&#39;s or master&#39;s degree in computer science, engineering, mathematics, or a related field, or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Experience with:</li>
</ul>
<ol>
<li>Streaming technologies (e.g. Kafka, Azure Event Hubs)</li>
<li>Time series analysis, NLP applications, or system modeling</li>
<li>NoSQL databases (e.g. MongoDB, Cosmos DB)</li>
<li>Docker and Kubernetes</li>
<li>Data visualization tools like Power BI, Tableau</li>
<li>Cloud or architecture certifications</li>
</ol>
<p><strong>Language &amp; Mobility (Germany)</strong></p>
<ul>
<li>Fluent German skills (at least C1) for customer communication in the German-speaking market</li>
<li>Very good English skills</li>
<li>Project-related travel readiness</li>
</ul>
<p><strong>Your Team</strong></p>
<p>You will become part of our growing Data &amp; Analytics teams. In this area, you will work with modern technologies in modern data ecosystems. You have the opportunity to turn your own ideas into results - in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>You will become an employee of a globally renowned management consulting firm at the forefront of technological innovation and industrial transformation. We work across industries with leading companies. Our culture is inclusive and entrepreneurial. As a mid-sized consulting firm embedded in the size of Infosys, we can support our customers worldwide and throughout the entire transformation process in a partnership-like manner.</p>
<p>Our values IC-LIFE - Inclusion, Equity &amp; Diversity, Client, Leadership, Integrity, Fairness, and Excellence - form our compass of values. Further information can be found on our career website.</p>
<p>In Europe, we are awarded by the Financial Times and Forbes as one of the leading consulting firms. Infosys is ranked among the top employers in Germany 2023 and has been certified by the Top Employers Institute for outstanding working conditions in Europe for five consecutive years.</p>
<p>We offer a market-leading salary, attractive additional benefits, and excellent opportunities for further education and development. Have you become curious? Then we look forward to your application - apply now!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Science, Machine Learning, Deep Learning, GenAI, Data Engineering, Data Warehousing, Data Lakes, Lakehouses, Data Pipelines, Cloud-based Data Platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse, Microsoft Fabric, CI/CD, Python, SQL, Relational Databases, Streaming Technologies, Time Series Analysis, NLP Applications, System Modeling, NoSQL Databases, Docker, Kubernetes, Data Visualization Tools, Cloud Certifications, Architecture Certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting is a globally renowned management consulting firm that works with a market-leading brand in every sector, while its parent organization Infosys is a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/ecAfMkjFkA97qaoimVMGNF/hybrid-(senior)-consultant---data-engineering-%26-data-science-(m%2Fw%2Fd)--deutschlandweit-in-munich-at-infosys-consulting---europe</Applyto>
      <Location>Munich, Bavaria, Germany</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>630a981c-b19</externalid>
      <Title>Digital Marketing Architect - Consumer Goods, Retail and Logistics - Germany</Title>
      <Description><![CDATA[<p>Boost your career and collaborate with expert, talented colleagues to solve and deliver against our clients&#39; most important challenges. We are growing and are looking for people to join our team.</p>
<p><strong>The Role</strong></p>
<p>We are seeking a visionary and experienced Digital Marketing Architect to design, build, and optimize our digital marketing technology stack. You will be the central owner of our MarTech blueprint, ensuring all platforms work in harmony to support our omnichannel retail strategy.</p>
<p><strong>Key Responsibilities</strong></p>
<p><strong>MarTech Stack Architecture</strong></p>
<p>Design and govern the end-to-end architecture of our marketing technology stack, including our Customer Data Platform (CDP), e-commerce platform, personalization engine, loyalty platform, and campaign management tools.</p>
<p><strong>Omnichannel Customer Journey Design</strong></p>
<p>Architect the data flows and system integrations necessary to create a unified 360-degree customer view, connecting data from online touchpoints (website, app) and offline systems (Point-of-Sale, in-store events).</p>
<p><strong>Data &amp; Personalization Strategy</strong></p>
<p>In collaboration with the data team, design the marketing data model within our CDP. Architect solutions that leverage this data to deliver real-time, personalized content, product recommendations, and offers across all digital channels.</p>
<p><strong>Technology Evaluation &amp; Roadmap</strong></p>
<p>Lead the discovery, evaluation, and selection of new marketing technologies. Develop and maintain a multi-year MarTech roadmap that aligns with strategic business objectives for growth and customer experience.</p>
<p><strong>Collaboration &amp; Enablement</strong></p>
<p>Work closely with brand marketers, e-commerce managers, and CRM specialists to understand their needs and translate them into technical requirements and solutions. Empower teams by ensuring the technology is effective and user-friendly.</p>
<p><strong>Qualifications &amp; Skills</strong></p>
<p><strong>Experience</strong></p>
<p>8+ years in digital marketing technology, marketing operations, or solutions architecture. Direct experience within the retail or e-commerce industry is essential.</p>
<p><strong>MarTech Platform Expertise</strong></p>
<p>Proven hands-on experience architecting and integrating core retail marketing platforms:</p>
<ul>
<li>Customer Data Platforms (CDP): e.g., Segment, Tealium, Bloomreach</li>
<li>E-commerce Platforms: e.g., Shopify Plus, Salesforce Commerce Cloud, Magento (Adobe Commerce), Commercetools</li>
<li>Marketing/CRM Platforms: e.g., Salesforce Marketing Cloud, Braze, Emarsys</li>
<li>Personalization Engines: e.g., Dynamic Yield, Klevu, Nosto</li>
</ul>
<p><strong>Technical Proficiency</strong></p>
<ul>
<li>Strong understanding of APIs (REST, GraphQL) and data integration patterns.</li>
<li>Proficiency in SQL for data validation and analysis.</li>
<li>Solid understanding of data modeling, schema design, and identity resolution concepts.</li>
<li>Familiarity with web technologies (JavaScript, HTML, CSS) and tag management systems (Google Tag Manager).</li>
</ul>
<p><strong>Retail Business Acumen</strong></p>
<p>Deep understanding of key retail metrics (e.g., Customer Lifetime Value - CLV, Conversion Rate, Average Order Value - AOV) and the ability to connect technology solutions to business outcomes.</p>
<p><strong>Preferred Qualifications</strong></p>
<ul>
<li>Experience with headless commerce and composable architecture.</li>
<li>Familiarity with loyalty program platforms and their integration.</li>
<li>Knowledge of Digital Asset Management (DAM) and Product Information Management (PIM) systems.</li>
<li>Experience in both B2C and D2C retail environments.</li>
<li>Professional fluency in German is a strong asset.</li>
</ul>
<p><strong>About your team</strong></p>
<p>Our CRL (Consumer Goods, Retail &amp; Logistics) practice helps some of the largest global firms and most recognizable local brands solve their biggest challenges in today’s age of constant disruption. With diverse services spanning growth strategy and new product innovation, to omni-channel customer experience, supply chain resiliency and AI-driven new business models, we help clients shape and achieve their growth agenda for a sustainable future.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal goals. Curious to learn more? We’d love to hear from you.... Apply today!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Customer Data Platforms (CDP), E-commerce Platforms, Marketing/CRM Platforms, Personalization Engines, APIs (REST, GraphQL), SQL, Data modeling, Schema design, Identity resolution, Web technologies (JavaScript, HTML, CSS), Tag management systems (Google Tag Manager), Retail business acumen, Headless commerce and composable architecture, Loyalty program platforms, Digital Asset Management (DAM), Product Information Management (PIM) systems, B2C and D2C retail environments, German language</Skills>
      <Category>Marketing</Category>
      <Industry>Consulting</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting is a globally renowned management consulting firm that works with market leading brands across sectors. Our parent organization Infosys is a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/d8M3v8FZmkKSxx3yZUqYJ7/hybrid-digital-marketing-architect---consumer-goods%2C-retail-and-logistics---germany-in-munich-at-infosys-consulting---europe</Applyto>
      <Location>Munich, Bavaria, Germany</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>dcfed817-412</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Senior Data Domain Architect to join our team. As a Senior Data Domain Architect, you will design and develop Data/Domain IT architecture solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilize in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery</li>
<li>Solve complex problems and partner effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization</li>
<li>Work independently, receiving minimal guidance and direction, to solve for and influence Enterprise and System architecture through Domain level knowledge</li>
<li>Review high level design to ensure alignment to Solution Architecture</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives</li>
<li>Mentor developers and create reference implementations/frameworks</li>
<li>Partner with System Architects to elaborate capabilities and features</li>
<li>Deliver single domain architecture solutions and execute continuous domain level architecture improvement roadmap. Actively support design and steering of a continuous delivery pipeline</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Over 6 years of experience as a senior domain architect for Data domains</li>
<li>Advanced English Level</li>
<li>Master&#39;s degree (PLUS)</li>
<li>Insurance Experience (PLUS)</li>
<li>Financial Services Experience (PLUS)</li>
</ul>
<p><strong>Technical &amp; Business Skills:</strong></p>
<ul>
<li>ETL/ELT Tools (Informatica, DBT) – Advanced (7+ Years)</li>
<li>Data Architecture / Data Modeling – Advanced (MUST)</li>
<li>Data Warehouse – Advanced (MUST)</li>
<li>Cloud Data Platforms – Advanced</li>
<li>Data Integration Tools – Advanced</li>
<li>Snowflake or Databricks – Intermediate (4-6 Years) (MUST)</li>
<li>Any Cloud – Intermediate (4-6 Years)</li>
<li>Power BI or Tableau – Intermediate (4-6 Years)</li>
<li>Data Science tools (Sagemaker, Databricks) – Intermediate (4-6 Years)</li>
<li>Data Lakehouse – Intermediate (MUST)</li>
</ul>
<p><strong>Benefits:</strong></p>
<ul>
<li>A competitive salary and performance-based bonuses</li>
<li>Comprehensive benefits package</li>
<li>Flexible work arrangements (remote and/or office-based)</li>
<li>Private Health Insurance</li>
<li>Paid Time Off</li>
<li>Training &amp; Development opportunities in partnership with renowned companies</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/x7tKXYFBB815ca6oBV5T2E/remote-fbs-senior-data-domain-architect-in-brazil-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>b1d522e6-6ca</externalid>
      <Title>Data Engineering &amp; Data Science Consultant</Title>
      <Description><![CDATA[<p><strong>Data Engineering &amp; Data Science Consultant</strong></p>
<p>You will work hands-on on the design, build, and operationalisation of modern data and analytics solutions. You will contribute across the full lifecycle – from data ingestion and transformation to analytics, machine learning, and production deployment. You will collaborate closely with data engineers, architects, data scientists, and business stakeholders to deliver scalable, reliable, and value-driven data solutions in complex client environments.</p>
<p><strong>Your role will include:</strong></p>
<ul>
<li>Applying data science and machine learning techniques to real-world business problems</li>
<li>Working with structured and semi-structured data in data lakes, lakehouses, and data warehouses</li>
<li>Developing and optimising data transformations for analytical and machine learning workloads</li>
<li>Supporting the productionisation of data and ML solutions, including monitoring and optimisation</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>3–5 years of experience in data engineering, data science, or analytics</li>
<li>Hands-on experience delivering data and analytics solutions in project-based or client environments</li>
<li>Strong problem-solving skills and a pragmatic, delivery-oriented mindset</li>
</ul>
<p><strong>Data Engineering Foundations</strong></p>
<ul>
<li>Experience building end-to-end data pipelines (ingestion, transformation, storage)</li>
<li>Solid understanding of data modelling, data transformations, and feature engineering</li>
<li>Familiarity with cloud-based data platforms, such as Azure, AWS, or GCP</li>
<li>Understanding of CI/CD concepts and production-grade deployments</li>
</ul>
<p><strong>Applied Data Science &amp; Analytics</strong></p>
<ul>
<li>Experience applying statistical analysis and machine learning techniques</li>
<li>Strong programming skills in Python</li>
<li>Very good SQL skills and experience working with relational databases</li>
<li>Experience deploying or supporting ML models in production environments</li>
<li>Ability to translate analytical results into business-relevant insights</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to have</strong></p>
<ul>
<li>Experience with streaming technologies (e.g. Kafka, Azure Event Hubs)</li>
<li>Exposure to GenAI, NLP, time series, or advanced analytics use cases</li>
<li>Experience with NoSQL databases (e.g. MongoDB, Cosmos DB)</li>
<li>Familiarity with Docker and Kubernetes</li>
<li>Experience with data visualisation tools (e.g. Power BI, Tableau)</li>
<li>Cloud or data-related certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data science, machine learning, data engineering, data analytics, cloud-based data platforms, Azure, AWS, GCP, Python, SQL, relational databases, data visualisation tools, Power BI, Tableau, streaming technologies, GenAI, NLP, time series, advanced analytics, NoSQL databases, Docker, Kubernetes</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting is a globally renowned management consulting firm that works with market-leading brands across sectors. It is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/sJqhkr23sMG2F6ppqk2BQn/remote-data-engineering-%26-data-science-consultant-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>56dc9a51-e66</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will act as a senior technical leader in complex data and analytics engagements. You will shape and govern end-to-end enterprise data architectures, lead technical teams, and serve as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:
<ul>
<li>Data Mesh / Data Fabric / data lake / data warehouse architectures</li>
<li>Modern data architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data platform, DevOps, deployment and security architectures</li>
<li>Analytics and AI enablement architectures</li>
</ul>
</li>
<li>Hands-on experience with cloud data platforms, e.g.:
<ul>
<li>Azure, AWS or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
</ul>
</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:
<ul>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data management software and tools</li>
<li>Security, access control, and compliance considerations</li>
</ul>
</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on Experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>You will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>enterprise data architecture, system data integration, data engineering, analytics, modern data architectures, Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, cloud data platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, SQL, relational databases, Postgres, SQL Server, Oracle, NoSQL databases, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, data migration programmes, data pipelines, orchestration, automation, CI/CD concepts, production-grade deployments, distributed systems, Docker, Kubernetes, data management and governance principles, data quality, metadata, lineage, master data management, data management software and tools, security, access control, compliance considerations, Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience, advanced analytics, AI / ML or GenAI, streaming platforms, Kafka, Azure Event Hubs, data governance or metadata tools, cloud, data, architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. It is a mid-size player with a supportive, entrepreneurial spirit.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/hpBWjvvy8D6B1f818cHxZR/remote-principal-consultant---data-architecture-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>1afd04e5-198</externalid>
      <Title>Data &amp; Cloud Technical Project Manager (H/F)</Title>
      <Description><![CDATA[<p>We are a data company that helps brands improve their marketing, media and customer experience through a combination of consulting and technology services. Our team of over 320 experts includes digital consultants, data scientists, engineers and media specialists who work together to provide high-level marketing advice and technical assistance to brands across various industries.</p>
<p>Our services include data platform implementation, data governance, data analytics, and more. We work with brands to help them become omnichannel organisations that can effectively manage their digital ecosystem and its synergies with the physical world.</p>
<p>We are based in Paris and operate across three time zones from our 10 offices in Paris, London, Geneva, Milan, Shanghai, Hong Kong, Shenzhen, Taipei, Singapore and New York. We prioritise the well-being of our employees, which has enabled us to be ranked as one of the best workplaces in France in 2018.</p>
<p>We are looking for a Data &amp; Cloud Technical Project Manager to join our team. The successful candidate will be responsible for managing technical projects and teams, ensuring the successful delivery of projects within agreed timelines and budgets.</p>
<p>Key responsibilities:</p>
<ul>
<li>Manage technical projects and teams, ensuring the successful delivery of projects within agreed timelines and budgets</li>
<li>Lead technical teams and ensure the development of technical skills and expertise</li>
<li>Collaborate with data science teams to develop advanced use cases for clients</li>
<li>Develop and implement data governance frameworks for clients</li>
<li>Ensure the quality and accuracy of data and analytics</li>
<li>Collaborate with clients to understand their needs and develop solutions</li>
<li>Manage client relationships and ensure client satisfaction</li>
</ul>
<p>Requirements:</p>
<ul>
<li>6-8 years of experience in a technical consulting role</li>
<li>Experience in managing technical projects and teams</li>
<li>Strong technical skills, including data analytics, data governance and cloud computing</li>
<li>Excellent communication and presentation skills</li>
<li>Ability to work in a fast-paced environment and manage multiple projects simultaneously</li>
<li>Strong problem-solving skills and ability to think critically</li>
<li>Experience working with clients and developing solutions to meet their needs</li>
</ul>
<p>Preferred qualifications:</p>
<ul>
<li>Experience working with data platforms and data governance frameworks</li>
<li>Experience working with cloud computing platforms, including Google Cloud, Amazon Web Services and Microsoft Azure</li>
<li>Experience working with data analytics tools, including Google Data Studio and Tableau</li>
<li>Experience working with data science tools, including Python and R</li>
<li>Experience working with machine learning algorithms and models</li>
</ul>
<p>We offer a competitive salary and benefits package, including a comprehensive health insurance plan, a 401(k) matching program and a generous paid time off policy. We also offer a dynamic and supportive work environment, with opportunities for professional growth and development.</p>
<p>If you are a motivated and experienced technical professional looking for a new challenge, we encourage you to apply for this exciting opportunity.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>Competitive salary and benefits package</Salaryrange>
      <Skills>data analytics, data governance, cloud computing, project management, technical leadership, data science, machine learning, data platforms, data governance frameworks, Google Cloud, Amazon Web Services, Microsoft Azure, Google Data Studio, Tableau, Python, R, machine learning algorithms, data science tools</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Fifty-Five</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Fifty-Five is a global data company that helps brands collect, analyse and activate their data across paid, earned and owned channels. The company has over 320 employees worldwide.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/3bHkbDwesiUkgLvBrfxgUo/hybrid-data-%26-cloud-technical-project-manager-(h%2Ff)-in-paris-at-fifty-five</Applyto>
      <Location>Paris</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>3487a0dd-b87</externalid>
      <Title>Associate, Data Engineer</Title>
      <Description><![CDATA[<p><strong>Associate, Data Engineer at BlackRock</strong></p>
<p>About this role</p>
<p>BlackRock is looking for a data engineer to join the Digital Data Engineering team. In this role, you will help develop data integrations between BlackRock’s internal data systems and our external marketing technology platforms. You will work with business partners to develop data structures, build ETL pipelines, and implement appropriate data governance and monitoring.</p>
<p>As part of BlackRock’s Digital organization, this role supports our mission to create AI-enabled, personalized and scalable marketing experiences. You will build the data foundations that power next generation digital platforms, audience personalization, and intelligent activation across a global ecosystem.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design and build scalable data pipelines that support AI-enabled digital experiences, personalization, and marketing automation.</li>
<li>Leverage AI-driven development and testing tools to increase engineering quality, speed, and reliability.</li>
<li>Contribute to ongoing platform modernization efforts across Martech, content, analytics, and web ecosystems.</li>
<li>Collaborate with cross-functional stakeholders to ensure data is structured and governed in ways that accelerate downstream personalization and analytics use cases.</li>
<li>Architect and develop data solutions to bring new datasets into the digital ecosystem, including Private Markets data and product data.</li>
</ul>
<p><strong>Core Skills</strong></p>
<ul>
<li>You have flawless written and verbal communication skills and the ability to gain buy-in on plans from a non-technical audience.</li>
<li>You have experience working with a broad set of stakeholders, including non-technical and non-quantitative people.</li>
<li>You are comfortable using AI tools to enhance development workflows, such as prototyping, testing, documentation, and data validation.</li>
<li>You have a strong desire to develop creatively and promote innovation.</li>
<li>You&#39;re self-motivated and able to think big while also taking direction and feedback.</li>
<li>You have excellent teamwork and collaboration skills.</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>3+ years’ experience in SQL and Python, with experience in both RDBMS and Big Data structures. Existing experience with Snowflake-specific concepts is desirable.</li>
<li>Familiarity with using AI-assisted development tools (e.g., code generation, code review, unit test development) to improve quality and delivery efficiency.</li>
<li>ETL and pipeline development experience with Airflow and DBT is a plus.</li>
<li>CI/CD experience with Azure and understanding of API frameworks is a plus.</li>
<li>B.S. / M.S. degree in Computer Science, Engineering, or a related discipline.</li>
<li>Knowledge of Marketing technology platforms is desirable, but not required (e.g., Eloqua/Marketo, web analytics platforms, customer data platforms).</li>
<li>Relentless desire for understanding how processes work. Creativity in solving unconventional problems.</li>
<li>Adaptability and resiliency when overcoming challenges.</li>
</ul>
<p><strong>Our benefits</strong></p>
<p>To help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment and tools designed to help you in building a sound financial future; access to education reimbursement; comprehensive resources to support your physical health and emotional well-being; family support programs; and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</p>
<p><strong>Our hybrid work model</strong></p>
<p>BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.</p>
<p><strong>About BlackRock</strong></p>
<p>At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Python, RDBMS, Big Data structures, Snowflake-specific concepts, AI-assisted development tools, code generation, code review, unit test development, ETL and pipeline development, Airflow, DBT, CI/CD experience, Azure, API frameworks, B.S. / M.S. degree in Computer Science, Engineering, or a related discipline, Marketing technology platforms, Eloqua/Marketo, web analytics platforms, customer data platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>BlackRock</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>BlackRock is a global investment management company that provides a range of investment products and services to institutional and individual investors.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/dgM5uMjA3xyRYgwF3u3x72/associate%2C-data-engineer-in-budapest-at-blackrock</Applyto>
      <Location>Budapest</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>bb7bb8e9-e31</externalid>
      <Title>Data Engineer - 12 Month TFT</Title>
      <Description><![CDATA[<p>We&#39;re looking for an experienced Data Engineer to join our team at Electronic Arts. As a Data Engineer, you will collaborate with the Marketing team to implement data strategies and develop complex ETL pipelines that support dashboards for promoting deeper understanding of our business.</p>
<p>You will have experience developing and establishing scalable, efficient, automated processes for large-scale data analyses. You will also stay informed of the latest trends and research on all aspects of data engineering and analytics.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Design, implement and maintain efficient, scalable and robust data pipelines using cloud-native and open-source technologies</li>
<li>Develop and optimize ETL/ELT processes to ingest, transform, and deliver data from diverse sources</li>
<li>Automate deployment and monitoring of data workflows using CI/CD best practices</li>
<li>Guide communications between our users and studio engineers to provide scalable end-to-end solutions</li>
<li>Promote strategies to improve our data modelling, quality and architecture</li>
<li>Participate in code reviews, mentor junior engineers, and contribute to team knowledge sharing</li>
</ul>
<p>Required Qualifications:</p>
<ul>
<li>4+ years of relevant industry experience in a data engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field</li>
<li>Proficiency in writing SQL queries and knowledge of cloud-based databases like Snowflake, Redshift, BigQuery or other big data solutions</li>
<li>Experience in data modelling and tools such as dbt, ETL processes, and data warehousing</li>
<li>Experience with at least one programming language such as Python or Java</li>
<li>Experience with version control and code review tools such as Git</li>
<li>Knowledge of latest data pipeline orchestration tools such as Airflow</li>
<li>Experience with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code tools (e.g., Docker, Terraform, CloudFormation)</li>
</ul>
<p>Nice to Have:</p>
<ul>
<li>Experience in gaming and working with its telemetry data or data from similar sources</li>
<li>Experience with big data platforms and technologies such as EMR, Databricks, Kafka, Spark, Iceberg</li>
<li>Experience in developing engineering solutions based on near real-time/streaming dataset</li>
<li>Exposure to AI/ML, MLOps concepts and collaboration with data science or AI teams.</li>
</ul>
<p>Pay Transparency - North America</p>
<p>The ranges listed below are what EA in good faith expects to pay applicants for this role in these locations at the time of this posting. If you reside in a different location, a recruiter will advise on the applicable range and benefits. Pay offered will be determined based on a number of relevant business and candidate factors (e.g. education, qualifications, certifications, experience, skills, geographic location, or business needs).</p>
<p>Pay Ranges: $100,000 - $139,500 CAD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>temporary</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$100,000 - $139,500 CAD</Salaryrange>
      <Skills>SQL, cloud-based databases, data modelling, ETL processes, data warehousing, Python, Java, Git, Airflow, cloud platforms, infrastructure-as-code tools, gaming telemetry data, big data platforms, EMR, Databricks, Kafka, Spark, Iceberg, AI/ML, MLOps</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts is a leading video game developer and publisher with a portfolio of popular games and experiences.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Data-Engineer-12-month-TFT/212451</Applyto>
      <Location>Vancouver</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>03233669-78e</externalid>
      <Title>DevOps Engineer II</Title>
      <Description><![CDATA[<p>Job Title: DevOps Engineer II</p>
<p>You will be part of the DevOps team at Helpshift, responsible for creating, maintaining, scaling, and securing infrastructure used by many teams for critical workloads. The team works in various areas, including production and development infrastructure provisioning and maintenance, database infrastructure, automations, core infrastructure, security and compliance, and engineering processes.</p>
<p>Responsibilities:</p>
<ul>
<li>Design, implement, and maintain secure CI/CD pipelines for automating deployment, configuration, and testing processes.</li>
<li>Own Helpshift production services and ensure complete monitoring coverage, troubleshoot, and fix production issues.</li>
<li>Build a seamless zero-downtime process to upgrade our core infrastructure (ScyllaDB, Elasticsearch, Kafka, MongoDB, Redis).</li>
<li>Collaborate with development and operations teams to integrate security practices into the software development lifecycle.</li>
<li>Conduct regular security assessments, vulnerability scans, and penetration testing to identify and mitigate security risks.</li>
<li>Develop and maintain infrastructure as code (IaC) templates for provisioning and configuring cloud resources securely.</li>
<li>Monitor and respond to production incidents, including investigation, containment, and remediation activities.</li>
<li>Stay up-to-date with the latest security threats, vulnerabilities, and best practices, and make recommendations for continuous improvement.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>5+ years of relevant experience.</li>
<li>In-depth knowledge of running/managing UNIX-like operating systems (we use Ubuntu).</li>
<li>Strong knowledge of networking protocols, security architectures, and identity and access management (IAM) principles.</li>
<li>Experience with containerisation technologies (e.g., Docker, Kubernetes) and securing containerised environments.</li>
<li>Experience in designing and building solutions that are highly scalable, fault-tolerant, and cost-effective.</li>
<li>Experience with various FOSS tools for monitoring, graphing, capacity planning, and logging.</li>
<li>Experience with IaC tools like Ansible, Puppet, and Terraform.</li>
<li>Experience with cloud computing platforms like Amazon AWS, Google Cloud Platform, and Heroku.</li>
<li>Experience with managing NoSQL and RDBMS.</li>
<li>Experience with queuing systems (Kafka, RabbitMQ) and Big data platforms (Hadoop).</li>
<li>Good programming skills with a focus on scripting (Python, Shell, Perl).</li>
<li>Ability to analyse bottlenecks in architecture and debug quickly to resolve issues.</li>
<li>Have an automation mindset and the ability to reason and work with complex systems.</li>
<li>Excellent communication and documentation skills.</li>
<li>Quick learner and good mentor for junior team members.</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Hybrid setup</li>
<li>Worker&#39;s insurance</li>
<li>Paid Time Offs</li>
<li>Other employee benefits to be discussed by our Talent Acquisition team in India.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>UNIX-like operating systems, networking protocols, security architectures, identity and access management, containerisation technologies, IaC tools, cloud computing platforms, NoSQL and RDBMS, queuing systems, Big data platforms, scripting languages</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Helpshift</Employername>
      <Employerlogo>https://logos.yubhub.co/j.com.png</Employerlogo>
      <Employerdescription>Helpshift is a software company that provides customer service and support solutions. It has a centralised DevOps team.</Employerdescription>
      <Employerwebsite>https://apply.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/8CC248A4B7</Applyto>
      <Location>Pune</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
  </jobs>
</source>