<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>5c70414d-4e6</externalid>
      <Title>Full‑Stack data engineer</Title>
      <Description><![CDATA[<p>We are seeking a highly self-sufficient, motivated engineer with strong full-stack data engineering skills to join our team. This is a remote/offshore role that requires autonomy, excellent communication, and the ability to deliver high-quality work with limited supervision while collaborating with a predominantly US-based team.</p>
<p>You will build reliable, scalable data products and user experiences that power AI/ML modeling, agentic workflows, and reporting,working end-to-end from data ingestion and transformation through to UI. Our Python-based data platform is undergoing a major evolution toward a modern, cloud-native ELT architecture. We are standardizing on Snowflake as our central data platform and dbt as our core transformation framework, implementing scalable, maintainable ELT practices that simplify ingestion, modeling, and deployment.</p>
<p>This role will be pivotal in independently designing and building robust data pipelines and semantic layers that directly power our AI and machine learning initiatives,delivering clean, reliable, and well-modeled data assets to our data science team for feature engineering, model training, and production inference. You will collaborate closely (primarily via remote channels) with data scientists and ML engineers to ensure our data ecosystem is optimized for experimentation speed, model performance, and seamless integration into downstream products and services.</p>
<p>Key Responsibilities</p>
<ul>
<li>Remote collaboration &amp; communication: Operate effectively as an offshore member of a distributed team, proactively communicating status, risks, and blockers across time zones and coordinating overlap with US working hours as needed.</li>
</ul>
<ul>
<li>Full-stack data engineering: Build across the entire stack, including data ingestion/acquisition and transformation, APIs, front-end components, and automated test suites, delivering production-grade solutions with minimal hand-holding.</li>
</ul>
<ul>
<li>Autonomous delivery &amp; ownership: Take end-to-end ownership of features and projects,clarifying requirements, breaking work into milestones, estimating timelines, and delivering high-quality, well-documented solutions.</li>
</ul>
<ul>
<li>Specification and design: Translate short- and long-term business requirements, architectural considerations, and competing timelines into clear, actionable technical specifications and design documents.</li>
</ul>
<ul>
<li>Code quality: Write clean, maintainable, efficient code that adheres to evolving standards and quality processes, including unit tests and isolated integration tests in containerized environments.</li>
</ul>
<ul>
<li>Continuous improvement: Contribute to agile practices and provide input on technical strategy, architectural decisions, and process improvements, continuously suggesting better tools, patterns, and automation.</li>
</ul>
<p>Required Skills &amp; Experience</p>
<ul>
<li>Professional experience: 5+ years in software engineering, with a full-stack background building complex, scalable data-engineering pipelines using data warehouse technology, SQL with dbt, Python, AWS with Terraform, and modern UI technologies.</li>
</ul>
<ul>
<li>Modern data engineering: Strong experience with medallion data architecture patterns using data warehouse technologies (e.g., Snowflake), data transformation tooling (e.g., dbt), BI tooling, and NoSQL data marts (e.g., Elasticsearch/OpenSearch).</li>
</ul>
<ul>
<li>Testing and QA: Solid understanding of unit testing, CI/CD automation, and quality assurance processes for both data pipeline testing and operational data quality tests.</li>
</ul>
<ul>
<li>Remote work &amp; autonomy: Proven track record working in a remote or distributed environment, demonstrating self-motivation, reliable execution, and the ability to make sound technical decisions independently.</li>
</ul>
<ul>
<li>Agile methodology: Working knowledge of Agile development practices and workflows (e.g., sprint planning, stand-ups, retrospectives) in a distributed team setting.</li>
</ul>
<ul>
<li>Education: Bachelor’s or Master’s degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.</li>
</ul>
<p>Preferred Skills &amp; Experience</p>
<ul>
<li>Machine learning and AI: Hands-on experience with large language models (LLMs) and agentic frameworks/workflows.</li>
</ul>
<ul>
<li>Search and analytics: Familiarity with the ELK stack (Elasticsearch, Logstash, Kibana) for search and analytics solutions.</li>
</ul>
<ul>
<li>Cloud expertise: Experience with AWS cloud services; familiarity with SageMaker; and CI/CD tooling such as GitHub Actions or Jenkins.</li>
</ul>
<ul>
<li>Front-end expertise: Experience building user interfaces with Angular or a modern UI stack.</li>
</ul>
<ul>
<li>Financial domain knowledge: Broad understanding of equities, fixed income, derivatives, futures, FX, and other financial instruments.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, Snowflake, dbt, AWS, Terraform, modern UI technologies, data warehouse technology, SQL, unit testing, CI/CD automation, quality assurance processes, machine learning, AI, large language models, agentic frameworks, ELK stack, search and analytics solutions, cloud expertise, AWS cloud services, SageMaker, CI/CD tooling, front-end expertise, Angular, financial domain knowledge</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>FIC &amp; Risk Technology</Employername>
      <Employerlogo>https://logos.yubhub.co/mlp.eightfold.ai.png</Employerlogo>
      <Employerdescription>FIC &amp; Risk Technology is a technology company that provides risk management solutions.</Employerdescription>
      <Employerwebsite>https://mlp.eightfold.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://mlp.eightfold.ai/careers/job/755955321460</Applyto>
      <Location>Bangalore, Karnataka, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1bebb6dc-380</externalid>
      <Title>Staff Software Engineer, Platform</Title>
      <Description><![CDATA[<p>We live in unprecedented times – AI has the potential to exponentially augment human intelligence. As the world adjusts to this new reality, leading platform companies are scrambling to build LLMs at billion scale, while large enterprises figure out how to add it to their products.</p>
<p>At Scale, our products include the Generative AI Data Engine, SGP, Donovan, and others that power the most advanced LLMs and generative models in the world through world-class RLHF, human data generation, model evaluation, safety, and alignment.</p>
<p>As a Staff Software Engineer, you will define and drive both the architectural roadmap and implementation of core platforms and software systems. You will be responsible for providing high-level vision and driving adoption across the engineering org for orchestration, data abstraction, data pipelines, identity &amp; access management, and underlying cloud infrastructure.</p>
<p>Impact and Responsibilities:</p>
<ul>
<li>Architectural Vision: You will drive the design and implementation of foundational systems, acting as a bridge between high-level business goals and technical goals.</li>
</ul>
<ul>
<li>Cross-Functional Leadership: You will collaborate with cross-functional teams to define and drive adoption of the next generation of features for our AI data infrastructure.</li>
</ul>
<ul>
<li>Technical Ownership: You are responsible for proactively identifying and driving opportunities for organizational growth, driving improvements in programming practices, and upgrading the tools that define our development lifecycle.</li>
</ul>
<ul>
<li>Technical Mentorship: You will serve as a subject matter expert, presenting technical information to stakeholders and providing the guidance to elevate the engineering culture across the company.</li>
</ul>
<p>Ideally you’d have:</p>
<ul>
<li>8+ years of full-time engineering experience, post-graduation with specialities in back-end systems.</li>
</ul>
<ul>
<li>Extensive experience in software development and a deep understanding of distributed systems and public cloud platforms (AWS preferred).</li>
</ul>
<ul>
<li>Demonstrated a track record of independent ownership and leadership across successful multi-team engineering projects.</li>
</ul>
<ul>
<li>Possess excellent communication and collaboration skills, and the ability to translate complex technical concepts to non-technical stakeholders.</li>
</ul>
<ul>
<li>Experience working fluently with standard containerization &amp; deployment technologies like Kubernetes, Terraform, Docker, etc.</li>
</ul>
<ul>
<li>Experience with orchestration platforms, such as Temporal and AWS Step Functions.</li>
</ul>
<ul>
<li>Experience with NoSQL document databases (MongoDB) and structured databases (Postgres).</li>
</ul>
<ul>
<li>Strong knowledge of software engineering best practices and CI/CD tooling (CircleCI, ArgoCD).</li>
</ul>
<p>Nice to haves:</p>
<ul>
<li>Experience with data warehouses (Snowflake, Firebolt) and data pipeline/ETL tools (Dagster, dbt).</li>
</ul>
<ul>
<li>Experience scaling products at hyper-growth startups.</li>
</ul>
<ul>
<li>Excitement to work with AI technologies.</li>
</ul>
<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training.</p>
<p>For pay transparency purposes, the base salary range for this full-time position in the locations of San Francisco, New York, Seattle is: $252,000-$315,000 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$252,000-$315,000 USD</Salaryrange>
      <Skills>Software development, Distributed systems, Public cloud platforms, Containerization &amp; deployment technologies, Orchestration platforms, NoSQL document databases, Structured databases, Software engineering best practices, CI/CD tooling, Data warehouses, Data pipeline/ETL tools, Scaling products at hyper-growth startups, AI technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale develops reliable AI systems for the world&apos;s most important decisions, providing high-quality data and full-stack technologies that power leading models.</Employerdescription>
      <Employerwebsite>https://scale.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4649893005</Applyto>
      <Location>San Francisco, CA; New York, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>81b2e2ee-c36</externalid>
      <Title>Manager, Solutions Engineering</Title>
      <Description><![CDATA[<p>About Mixpanel</p>
<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>
<p>As a Manager on the Solutions Engineering team at Mixpanel, you will lead a talented group of analytics consultants who are pivotal to our success. You will be at the forefront of driving customer value, guiding your team as they serve as the primary technical resources for our Sales organisation.</p>
<p>Responsibilities</p>
<ul>
<li>Develop &amp; Mentor: Lead, coach, and grow a high-performing and inclusive team of Solutions Engineers, actively investing in their career development and upholding a high standard of performance.</li>
</ul>
<ul>
<li>Drive Results: Partner closely with Sales leadership and Account Executives to provide technical expertise that drives new, retained, and expansion ARR. You will ensure your team&#39;s activities are directly contributing to the company&#39;s bottom line.</li>
</ul>
<ul>
<li>Prioritise &amp; Problem Solve: Guide your team through complex customer evaluations and technical challenges. You will manage team resources effectively, aligning the right skills to customer needs to achieve productivity targets and successful outcomes.</li>
</ul>
<ul>
<li>Cross-Functional Partnership: Act as a key technical liaison, collaborating with peer managers across Sales, Product, and Engineering. You will gather and synthesise customer feedback from your team to influence product strategy and solve problems at scale.</li>
</ul>
<ul>
<li>Communicate &amp; Manage Change: Effectively translate broader company and departmental strategy into clear, actionable goals for your team. You will guide your direct reports through evolving business priorities with empathy and clarity.</li>
</ul>
<ul>
<li>Hire the Best: Actively assess the needs of the team, build a pipeline of top talent, and hire outstanding individuals who elevate the team&#39;s capabilities and contribute to our inclusive culture.</li>
</ul>
<ul>
<li>Innovate &amp; Raise the Bar: Relentlessly seek to improve how your team operates, from refining demo strategies and proof-of-concept methodologies to adopting new tools and processes that increase effectiveness and celebrate success.</li>
</ul>
<p>We&#39;re Looking For Someone Who</p>
<ul>
<li>Has progressive experience in a B2B SaaS environment, including 3+ years of people management experience leading a technical pre-sales, solutions engineering, or professional services team.</li>
</ul>
<ul>
<li>Exhibits a &#39;player-coach&#39; mentality with deep knowledge in the data and analytics space. You are an expert on how data products (like CDPs, data warehouses, and analytics tools) are implemented and adopted by customers.</li>
</ul>
<ul>
<li>Is a proven cross-functional partner with a track record of successfully working with sales teams to navigate complex deals and drive revenue.</li>
</ul>
<ul>
<li>Demonstrates expertise in communicating complex technical concepts clearly and effectively to both technical and non-technical stakeholders.</li>
</ul>
<ul>
<li>Is skilled at prioritising team activities and managing workload in a dynamic environment, balancing customer needs with efficiency goals.</li>
</ul>
<ul>
<li>Is a natural mentor and developer of talent, with a passion for coaching and a history of building inclusive, high-achieving teams.</li>
</ul>
<ul>
<li>Handles ambiguity with ease, demonstrating flexibility and a proactive, problem-solving mindset when adapting to new challenges and business priorities.</li>
</ul>
<ul>
<li>Actively seeks feedback and is humble to learn, consistently looking for ways to improve themselves and their team.</li>
</ul>
<p>Bonus Points</p>
<ul>
<li>Previous experience in management consulting, strategic operations, or a similar role focused on go-to-market strategy.</li>
</ul>
<ul>
<li>Direct, hands-on experience with Mixpanel or other product analytics tools like Amplitude, Pendo, or Contentsquare.</li>
</ul>
<ul>
<li>Strong familiarity with the modern data stack, including tools like Snowflake, Google BigQuery, Segment, or Hightouch.</li>
</ul>
<p>#LI-Hybrid</p>
<p>Compensation</p>
<p>The amount listed below is the total target cash compensation (TTCC) and includes base compensation and variable compensation in the form of either a company bonus or commissions. Variable compensation type is determined by your role and level. In addition to the cash compensation provided, this position is also eligible for equity consideration and other benefits including medical, vision, and dental insurance coverage.</p>
<p>Our salary ranges are determined by role and level and are benchmarked to the SF Bay Area Technology data cut released by Radford, a global compensation database. The range displayed represents the minimum and maximum TTCC for new hire salaries for the position across all of our US locations. To stay on top of market conditions, we refresh our salary ranges twice a year so these ranges may change in the future. Within the range, individual pay is determined by experience, job-related skills, qualifications, and other factors.</p>
<p>If you have questions about the specific range, your recruiter can share this information.</p>
<p>Mixpanel Compensation Range $238,300-$321,705 USD</p>
<p>Benefits and Perks</p>
<ul>
<li>Comprehensive Medical, Vision, and Dental Care</li>
</ul>
<ul>
<li>Mental Wellness Benefit</li>
</ul>
<ul>
<li>Generous Vacation Policy &amp; Additional Company Holidays</li>
</ul>
<ul>
<li>Enhanced Parental Leave</li>
</ul>
<ul>
<li>Volunteer Time Off</li>
</ul>
<ul>
<li>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</li>
</ul>
<p>Culture Values</p>
<ul>
<li>Make Bold Bets: We choose courageous action over comfortable progress.</li>
</ul>
<ul>
<li>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</li>
</ul>
<ul>
<li>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</li>
</ul>
<ul>
<li>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</li>
</ul>
<ul>
<li>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north star.</li>
</ul>
<ul>
<li>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</li>
</ul>
<p>Why choose Mixpanel?</p>
<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors: like Andreessen-Horowitz, Sequoia, YC, and, most recently, Bain Capital. Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviours and easily track overarching company success metrics.</p>
<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service. Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>
<p>Mixpanel is an equal opportunity employer supporting workforce diversity. At Mixpanel, we are focused on things that really matter,our people, our customers, our partners,out of a recognition that those relationships are the most important thing.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$238,300-$321,705 USD</Salaryrange>
      <Skills>data and analytics space, data products, CDPs, data warehouses, analytics tools, complex technical concepts, team activities, workload management, customer needs, efficiency goals, ambiguity, flexibility, problem-solving mindset, product analytics tools, Amplitude, Pendo, Contentsquare, modern data stack, Snowflake, Google BigQuery, Segment, Hightouch</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a leader in analytics with over 29,000 companies using its platform, including Workday, Pinterest, LG, and Rakuten Viber.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7513876</Applyto>
      <Location>San Francisco, US (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>477d343e-e37</externalid>
      <Title>Customer Success Architect</Title>
      <Description><![CDATA[<p>About Mixpanel</p>
<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence. Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees.</p>
<p>About the Customer Success Team:</p>
<p>Mixpanel’s Customer Success &amp; Solutions Engineering teams are analytics consultants who embed themselves within our enterprise customer teams to drive our customers’ business outcomes. We work with prospects and customers throughout the customer journey to understand what drives value and serve as the technical counterpart to our Sales organization to deliver on that value.</p>
<p>You will partner closely with Account Executives, Account Managers, Product, Engineering, and Support to successfully roll out self-serve analytics within our customers’ organizations, help the customer manage change, execute on technical projects and services that delight our customers, and ultimately drive ROI on the customer’s Mixpanel investment.</p>
<p>About the Role:</p>
<p>As a CSA, you will partner with customers throughout the customer journey to understand what drives value, beginning from the pre-sales running proof of concepts to demonstrate quick time to value, to post-sales onboarding and implementation, where you set customers up for long-term success with scalable implementation and data governance best practices. Throughout the entire customer lifecycle, you will work to understand how analytics can drive business value for your customers and will consult them on how to maximize the value of Mixpanel, including managing change during Mixpanel’s rollout, defining and achieving ROI, and identifying areas of improvement in their current usage of analytics.</p>
<p>For large enterprise customers, post onboarding, you will also continue alongside the Account Managers to drive data trust and product adoption for 100+ end user teams through a change management rollout approach.</p>
<p>Responsibilities:</p>
<p>Serve as a trusted technical advisor for prospects/customers to provide strategic consultation on data architecture, governance, instrumentation, and business outcomes</p>
<p>Effectively communicate at most levels of the customer’s organization to influence business outcomes via Mixpanel, design and execute a comprehensive analytics strategy, and unblock technical and organizational roadblocks</p>
<p>Own the customer’s success with Mixpanel , documenting and delivering ROI to the customer throughout their journey to transform their business with self-serve analytics</p>
<p>Own onboarding and data health for your assigned customers/projects, including ongoing enhancements to their data quality and overall tech stack integration</p>
<p>Engage with customers’ engineering, product management, and marketing teams to handle technical onboarding, optimize Mixpanel deployments, and improve data trust</p>
<p>Deliver a variety of technical services ranging from data architecture consultations to adoption and change management best practices</p>
<p>Leverage modern data architecture expertise to create scalable data governance practices and data trust for our customers, including data optimization and re-implementation projects</p>
<p>Successfully execute on success outcomes whilst balancing project timelines, scope creep, and unanticipated issues</p>
<p>Bridge the technical-business gap with your customers , working with business stakeholders to define a strategic vision for Mixpanel and then working with the right business and technical contacts to execute that vision</p>
<p>Collaborate with our technical and solutions partners as needed on data optimization and onboarding projects</p>
<p>Be a technical sponsor for internal engagements with Mixpanel product and engineering teams to prioritize product and systems tasks from clients</p>
<p>We&#39;re Looking For Someone Who Has</p>
<p>3 to 5 years of experience consulting on defining and delivering ROI through new tool implementations</p>
<p>Experience working with Director-level members of the customer organization to define a strategic vision and successfully leveraging those members to deliver on that vision</p>
<p>The ability to communicate with stakeholders at most levels of an organization , from talking with developers about the ins and outs of an API to talking to a Director of Data Science/Product Management about organizational efficiency</p>
<p>Can manage complex projects with assorted client stakeholders, working across teams and departments to execute real change</p>
<p>Has a demonstrated successful record of experience in customer success, client-facing professional services, consulting, or technical project management role</p>
<p>Excellent written, analytical, and communication skills</p>
<p>Strong process and/or project delivery discipline</p>
<p>Eager to learn new technologies and adapt to evolving customer needs</p>
<p>We&#39;d Be Extra Excited For Someone Who Has</p>
<p>Experience in data querying, modeling, and transforming in at least one core tool, including SQL / dbt / Python / Business Intelligence tools / Product Analytics tools, etc.</p>
<p>Familiar with databases and cloud data warehouses like Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, etc.</p>
<p>Familiar with product analytics implementation methods like SDKs, Customer Data Platforms (CDPs), Event Streaming, Reverse ETL, etc.</p>
<p>Familiar with analytics best practices across business segments and verticals</p>
<p>Benefits and Perks</p>
<p>Comprehensive Medical, Vision, and Dental Care</p>
<p>Mental Wellness Benefit</p>
<p>Generous Vacation Policy &amp; Additional Company Holidays</p>
<p>Enhanced Parental Leave</p>
<p>Volunteer Time Off</p>
<p>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</p>
<p>Culture Values</p>
<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>
<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>
<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>
<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>
<p>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north star.</p>
<p>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</p>
<p>Why choose Mixpanel?</p>
<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors: like Andreessen-Horowitz, Sequoia, YC, and, most recently, Bain Capital.</p>
<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>
<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>
<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>
<p>Mixpanel is an equal opportunity employer supporting workforce diversity.</p>
<p>At Mixpanel, we are focused on things that really matter,our people, our customers, our partners,out of a recognition that those relationships are the most valuable assets we have.</p>
<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>
<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, or any other protected characteristic.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data architecture, governance, instrumentation, business outcomes, data querying, modeling, transforming, SQL, dbt, Python, Business Intelligence tools, Product Analytics tools, databases, cloud data warehouses, Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, SDKs, Customer Data Platforms, Event Streaming, Reverse ETL</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a leading provider of digital analytics software, serving over 29,000 companies worldwide.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7506821</Applyto>
      <Location>Bengaluru, India (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b3cf0ff9-4c6</externalid>
      <Title>Support Engineer II</Title>
      <Description><![CDATA[<p>About Mixpanel</p>
<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>
<p>Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees. Mixpanel delivers insights that customers trust.</p>
<p>Visit mixpanel.com to learn more.</p>
<p>About The Support Team</p>
<p>Mixpanel Support is a team of talented problem-solvers from diverse backgrounds. We care deeply about helping our customers be successful and enabling them to get value from their data.</p>
<p>We are located all over the world in San Francisco, Barcelona, London, and Singapore...</p>
<p>About The Role</p>
<p>The right candidate is an avid learner, an advocate for customers, and a collaborative teammate. The main responsibility of a Support Engineer is to help users solve technical challenges and use Mixpanel to make impactful product decisions.</p>
<p>We’ve had team members focus on developing their technical skills to join the product and engineering teams, hone their customer-facing skills to become customer success managers or sales engineers, and take on leadership roles in the Support organization.</p>
<p>Responsibilities</p>
<p>The core responsibility of a Support Engineer is to support our customers at every turn in the Mixpanel journey by providing answers to product questions, sharing best practices, and debugging technical issues.</p>
<p>You&#39;ll also develop your technical skills, collaborate with our Product team to improve our product, learn product analytics, and mentor new team members.</p>
<p>Become a Mixpanel product expert - you will help users understand our reports and features, help them use our APIs and SDKs, share best practices, and resolve account issues</p>
<p>Respond to customer inquiries via Zendesk email, chat, Slack, and phone calls</p>
<p>Investigate and document bugs and feature requests to share with our Product and Engineering teams</p>
<p>Provide feedback regarding internal support processes, product functionality, and customer education resources to improve the customer experience</p>
<p>Shape the product by regularly working closely with PM’s, engineers, and designers to incorporate customer learnings into change</p>
<p>We&#39;re Looking For Someone Who Has</p>
<p>Experience providing customer facing SAAS support (in customer support, professional services, technical account management or similar)</p>
<p>Ability to communicate technical concepts effectively in a clear, friendly writing style</p>
<p>Excellent problem-solving and analytical skills</p>
<p>Programming experience, understanding of web &amp; mobile technologies, and interacting with APIs</p>
<p>Experience with debugging and collaborating with engineering to resolve complex technical issues, especially with JavaScript, Python, or mobile technologies</p>
<p>Ability to be resourceful and resilient when faced with ambiguity and new challenges</p>
<p>Dedication to developing expertise in a complex and constantly evolving product</p>
<p>Interest and aptitude to develop technical skills and learn new technologies</p>
<p>Experience providing SLA based support and/or dedicated support to strategic customers</p>
<p>Speak Hebrew and fluent English</p>
<p>Bonus Points</p>
<p>Experience with Mixpanel or other analytics tools</p>
<p>Familiar with databases and cloud data warehouses like Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, etc.</p>
<p>Familiar with product analytics implementation methods like SDKs, Customer Data Platforms (CDPs), Event Streaming, Reverse ETL, etc.</p>
<p>Benefits and Perks</p>
<p>Comprehensive Medical, Vision, and Dental Care</p>
<p>Mental Wellness Benefit</p>
<p>Generous Vacation Policy &amp; Additional Company Holidays</p>
<p>Enhanced Parental Leave</p>
<p>Volunteer Time Off</p>
<p>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</p>
<p>Culture Values</p>
<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>
<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>
<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>
<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>
<p>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north star.</p>
<p>Why choose Mixpanel?</p>
<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors: like Andreessen-Horowitz, Sequoia, YC, and, most recently, Bain Capital.</p>
<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>
<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>
<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>
<p>Mixpanel is an equal opportunity employer supporting workforce diversity.</p>
<p>At Mixpanel, we are focused on things that really matter,our people, our customers, our partners,out of a recognition that those relationships are the most valuable assets we have.</p>
<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>
<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, veteran status, or disability status.</p>
<p>Pursuant to the San Francisco Fair Chance Ordinance or other similar laws that may be applicable, we will consider for employment qualified applicants with arrest and conviction records.</p>
<p>We’ve immersed ourselves in our Culture and Values as our guiding principles for the impact we want to have and the future we are building.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel></Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>customer facing SAAS support, technical concepts, problem-solving, programming experience, web &amp; mobile technologies, APIs, debugging, collaboration, SLA based support, dedicated support, Hebrew, English, Mixpanel, analytics tools, databases, cloud data warehouses, product analytics implementation methods, SDKs, Customer Data Platforms, Event Streaming, Reverse ETL</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a digital analytics platform that helps teams accelerate adoption, improve retention, and ship with confidence. It has over 29,000 customers, including Workday, Pinterest, LG, and Rakuten Viber.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7650541</Applyto>
      <Location>Tel Aviv, Israel (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1869fa15-51d</externalid>
      <Title>Software Engineer, Platform</Title>
      <Description><![CDATA[<p>We&#39;re looking for a skilled Software Engineer to join our Platform Engineering team. As a key member of our team, you will support the design and development of shared platforms used across Scale. This includes designing our foundational data platforms and lifecycle, architecting Scale&#39;s core cloud infrastructure and orchestration stack, and redefining how engineers develop, build, test, and deploy software at Scale.</p>
<p>You will drive the design, and implementation of our foundational platforms and systems, working closely with stakeholders and internal customers to understand and refine requirements. You&#39;ll collaborate with cross-functional teams to define, design, and deliver new features. You&#39;ll also proactively identify opportunities for, and drive improvements to, current programming practices, including process enhancements and tool upgrades.</p>
<p>Ideally, you&#39;d have 3+ years of full-time engineering experience, post-graduation with specialities in back-end systems. You should have extensive experience in software development and a deep understanding of distributed systems and public cloud platforms (AWS preferred). You should show a track record of independent ownership of successful engineering projects. You should possess excellent communication and collaboration skills, and the ability to translate complex technical concepts to non-technical stakeholders.</p>
<p>You should have experience working fluently with standard containerization &amp; deployment technologies like Kubernetes, Terraform, Docker, etc. You should have experience with orchestration platforms, such as Temporal and AWS Step Functions. You should have experience with NoSQL document databases (MongoDB) and structured databases (Postgres). You should have strong knowledge of software engineering best practices and CI/CD tooling (CircleCI).</p>
<p>Nice to haves include experience with data warehouses (Snowflake, Firebolt) and data pipeline/ETL tools (Dagster, dbt). Experience with authentication/authorization systems (Zanzibar, Authz, etc.) is also a plus. Experience scaling products at hyper-growth startups is highly valued. Excitement to work with AI technologies is a must.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,000-$225,000 USD</Salaryrange>
      <Skills>software development, distributed systems, public cloud platforms, containerization &amp; deployment technologies, orchestration platforms, NoSQL document databases, structured databases, software engineering best practices, CI/CD tooling, data warehouses, data pipeline/ETL tools, authentication/authorization systems, scaling products at hyper-growth startups, AI technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale develops reliable AI systems for the world&apos;s most important decisions.</Employerdescription>
      <Employerwebsite>https://scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4594879005</Applyto>
      <Location>San Francisco, CA; New York, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>aeba45bc-3e4</externalid>
      <Title>Senior Solutions Engineer</Title>
      <Description><![CDATA[<p>About Mixpanel</p>
<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>
<p>Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees. Mixpanel delivers insights that customers trust.</p>
<p>Visit mixpanel.com to learn more.</p>
<p>About the Customer Success &amp; Solutions Engineering Team</p>
<p>Mixpanel’s Customer Success &amp; Solutions Engineering teams are analytics consultants who embed themselves within our enterprise customer teams to drive our customer’s business outcomes. We work with prospects and customers throughout the customer journey to understand what drives value and serve as the technical counterpart to our Sales organization to deliver on that value.</p>
<p>You will partner closely with Account Executives, Account Managers, Product, Engineering, and Support to successfully roll out self-serve analytics within our customer’s organizations, help the customer manage change, execute on technical projects and services that delight our customers and ultimately drive ROI on the customer’s Mixpanel investment.</p>
<p>About the Role</p>
<p>Our SEs are inquisitive, nimble, and able to clearly articulate the technical benefits and requirements of Mixpanel to developers and product managers, while also communicating the business value of our product to high-level executives. In your first month, you’ll become a Mixpanel expert,both in features and functionality as well as implementation. You’ll have the opportunity to shadow customer calls and demos with current Sales Engineers and Account Executives while learning to articulate our value proposition. You’ll also be trained on Mixpanel’s internal systems and tools to set you up for success.</p>
<p>Within your first three months, you’ll be directly involved in deal cycles with Commercial Account Executives. You’ll lead the technical qualification for customer use cases and deliver customized demos for prospects. You’ll work directly with leadership at the prospect’s organization to understand business challenges that can be solved through an analytics platform and consult on how Mixpanel can address those challenges to achieve a strong ROI. You’ll also work with the prospect’s business and technical teams to scope and execute proof-of-concept projects to establish Mixpanel’s value,including consulting on data ingestion methods, overall architecture, success criteria, and rollout strategies for analytics tools across an organization.</p>
<p>Responsibilities</p>
<p>Serve as a trusted technical advisor for prospects, providing strategic consultation on data architecture, governance, instrumentation, and business outcomes.</p>
<p>Communicate and consult effectively at all levels of the customer’s organization to earn trust and influence buying decisions.</p>
<p>Bridge the technical-business gap,working with senior stakeholders to define success for proof-of-concepts and ensuring successful execution and outcomes.</p>
<p>Leverage your Mixpanel expertise and technical/consultative skills to impart best practices throughout proof-of-concept projects.</p>
<p>Partner with Account Executives to drive revenue growth, serving as the key technical contact for customers.</p>
<p>Partner with post-sales teams to ensure that pre-sales value propositions translate into tangible post-sales results.</p>
<p>Develop relationships and uncover the needs of key technical stakeholders within your assigned book of business.</p>
<p>Be the “Voice of the Prospect” by collecting feedback from potential Mixpanel customers and sharing it with the Product team.</p>
<p>We&#39;re Looking For Someone Who Has</p>
<p>The ability to communicate with stakeholders at all levels,from discussing APIs with developers to organizational efficiency with CIOs.</p>
<p>A demonstrated track record of qualifying and selling technical solutions to executive stakeholders.</p>
<p>6+ years of experience in a Software-as-a-Service Sales Engineering or related role.</p>
<p>Experience in data querying, modeling, and transformation using tools such as SQL, dbt, Python, Business Intelligence platforms, or Product Analytics tools.</p>
<p>Familiarity with databases and cloud data warehouses (e.g., Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks).</p>
<p>A successful record of experience in sales engineering, customer success, client-facing professional services, consulting, or technical project management.</p>
<p>Excellent written, analytical, communication, and presentation skills.</p>
<p>Strong process and project delivery discipline.</p>
<p>The ability to travel.</p>
<p>Fluency in multiple languages; German preferred.</p>
<p>Benefits and Perks</p>
<p>Comprehensive Medical, Vision, and Dental Care</p>
<p>Mental Wellness Benefit</p>
<p>Generous Vacation Policy &amp; Additional Company Holidays</p>
<p>Enhanced Parental Leave</p>
<p>Volunteer Time Off</p>
<p>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</p>
<p>Culture Values</p>
<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>
<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>
<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>
<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>
<p>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north star.</p>
<p>Why choose Mixpanel?</p>
<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors: like Andreessen-Horowitz, Sequoia, YC, and, most recently, Bain Capital.</p>
<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>
<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>
<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>
<p>Mixpanel is an equal opportunity employer supporting workforce diversity.</p>
<p>At Mixpanel, we are focused on things that really matter,our people, our customers, our partners,out of a recognition that those relationships are the most valuable assets we have.</p>
<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>
<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, veteran status, or disability status.</p>
<p>Pursuant to the San Francisco Fair Chance Ordinance or other similar laws that may be applicable, we will consider for employment qualified applicants with arrest and conviction records.</p>
<p>We’ve immersed ourselves in our Culture and Values as our guiding principles for the impact we want to have and the future we are building.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, dbt, Python, Business Intelligence platforms, Product Analytics tools, Databases, Cloud data warehouses, Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a software company that provides a digital analytics platform. It has over 29,000 customers and has raised $277M from prominent investors.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7407407</Applyto>
      <Location>London, UK (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>844d7046-2a6</externalid>
      <Title>Salesforce Administrator</Title>
      <Description><![CDATA[<p>As a Salesforce Administrator at Anthropic, you will play a key role in building and maintaining world-class CRM systems for our GTM team. In this role, you&#39;ll work both autonomously and collaboratively to design and implement tactical and project-related deliverables that support our business growth. You&#39;ll work closely with other Revenue Operations team members and business partners to understand requirements and drive strategic solutions, always with a focus on flexibility, maintainability, and scale.</p>
<p>Responsibilities:</p>
<p>Configure and customize Salesforce to meet business needs Optimize QTC solutions with Salesforce and other systems (Ironclad, Stripe and Metronome) Design and develop high quality solutions in an agile environment including partner management, forecasting, territory mapping, internationalization, pipeline optimization, and lead management processes Work closely with operations and business partners to understand and refine requirements, objectives, and processes Build for scale by designing holistically, with a focus on flexibility and maintainability Ensure appropriate controls and documentation are followed to create an effective control environment Deal with ambiguity in a rapidly changing business environment, resolve problems and offer impacting solutions Configure and maintain other business systems like Hubspot, Apollo, and Clay Build and optimize integrations between Salesforce and other platforms</p>
<p>You may be a good fit if you:</p>
<p>Have 3+ years of Salesforce admin experience Have experience implementing and maintaining QTC/CPQ solutions Hold Salesforce Basic and Advanced Certifications (Advanced Administrator certification, Advanced Developer or Platform II certification) Have in-depth understanding of the capabilities and constraints of the Salesforce CRM application architecture Have experience with data migration and integrating Salesforce with other platforms/services Have worked with financial systems such as Stripe and Netsuite (Metronome a plus) Have basic SQL skills and experience working with data warehouses Have proven experience scaling Salesforce implementations in high-growth environments Have experience working closely with and implementing solutions for a fast-growing GTM organization.</p>
<p>Possess excellent analytical skills, combined with impeccable business judgment Can communicate effectively with management, sales, marketing, vendors and international teams</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$190,000-$270,000 USD</Salaryrange>
      <Skills>Salesforce, QTC/CPQ solutions, Ironclad, Stripe, Metronome, Hubspot, Apollo, Clay, SQL, Data warehouses</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a public benefit corporation that creates reliable, interpretable, and steerable AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5075070008</Applyto>
      <Location>San Francisco, CA | New York City, NY | Seattle, WA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>df625390-342</externalid>
      <Title>AI Analytics Engineer (Marketing Analytics)</Title>
      <Description><![CDATA[<p>We&#39;re seeking an AI Analytics Engineer to join our Data Science &amp; Analytics team. As a high-impact, early-career role, you will be responsible for building the canonical data infrastructure, owning critical dashboards, and enabling Marketing stakeholders to execute faster, more confident, data-driven decisions.</p>
<p>You will design and maintain trustworthy data models for core marketing metrics, manage the full lifecycle from prototyping through production, and develop and govern dbt data pipelines. You will also build and optimize dashboards that deliver real-time, self-serve insights across high-priority marketing areas, drive data independence for Marketing stakeholders, and collaborate with the Marketing team and data partners to establish the AI Business Context layer for marketing use cases.</p>
<p>You will serve as the primary data partner for marketing managers, demand generation teams, and leadership, translating complex data insights into clear business recommendations via dashboards, memos, and presentations. You will achieve a comprehensive mastery of Airtable&#39;s marketing data models, existing pipelines, and BI tools within the first 6 months, becoming the definitive internal expert.</p>
<p>This is a genuinely AI-native role, requiring active, demonstrated daily use of AI coding tools such as Cursor, Claude, ChatGPT, and Gemini. You must provide specific, concrete examples of how these tools are integral to your work, moving beyond simple familiarity.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Expert-level SQL, Proficiency with dbt or equivalent data transformation tools, Experience with BI and visualization platforms (Looker, Omni, Tableau, Hex, or similar), Active, demonstrated daily use of AI coding tools (Cursor, Claude, ChatGPT, Gemini), Mandatory use of GitHub for version control in a standard development workflow, Python for data work (pandas, ETL scripting, or analysis), Prior exposure to marketing data concepts: attribution, funnel metrics, lead scoring, or campaign performance, Familiarity with CRM (Salesforce) or marketing automation platforms (Marketo), Experience with Databricks or cloud data warehouses, A public portfolio showcasing data or AI-assisted engineering work (GitHub, personal projects, Kaggle)</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Airtable</Employername>
      <Employerlogo>https://logos.yubhub.co/airtable.com.png</Employerlogo>
      <Employerdescription>Airtable is a no-code app platform that empowers people to accelerate their most critical business processes.</Employerdescription>
      <Employerwebsite>https://airtable.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/airtable/jobs/8434307002</Applyto>
      <Location>San Francisco, CA; Austin, TX; New York, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>84875ccd-0a5</externalid>
      <Title>Senior Partner Marketing Manager</Title>
      <Description><![CDATA[<p>As a Senior Partner Marketing Manager, you&#39;ll play a critical role in shaping and executing co-marketing initiatives with some of our most important technology partners globally.</p>
<p>Your work will amplify the reach and impact of dbt across the modern data stack, helping to elevate our brand and drive growth through the ecosystem.</p>
<p>This is a highly cross-functional role where strategic thinking, creativity, and strong collaboration skills will be key to success.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Lead the creation and execution of marketing campaigns, programs, events, and activities with strategic technology partners.</li>
<li>Collaborate closely with the Revenue Marketing, Partnerships, and Product Marketing teams to ensure GTM partner plans align with dbt Labs&#39; broader business goals.</li>
<li>Build and nurture relationships with marketing counterparts at key partners like Snowflake, Google, AWS, Microsoft, and Databricks to align on co-marketing efforts and shared objectives.</li>
<li>Clearly articulate the value of dbt to partners and support them in promoting the platform internally and to their customer base.</li>
<li>Own the development of joint messaging and co-branded assets,including blogs, webinars, solution briefs, and presentation decks,ensuring alignment and consistency across all public-facing content.</li>
<li>Create internal enablement materials to educate and empower sales teams to leverage partner campaigns and initiatives.</li>
<li>Gain a deep understanding of partner business strategies and priorities; design co-marketing programs that provide mutual value.</li>
<li>Set and manage OKRs, track program performance, and deliver quarterly reviews with partners to assess impact, identify opportunities, and ensure strategic alignment.</li>
<li>Develop annual GTM marketing plans tailored to individual partners, accounting for geographic and vertical-specific nuances.</li>
<li>Exercise strategic judgment in deciding which partner activities to pursue and how best to allocate time and resources for maximum impact.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>8+ years of experience in B2B marketing or similar, particularly within the data or software industry, with a strong track record of building and executing successful partner marketing programs.</li>
<li>A &#39;builder&#39; mindset: you enjoy solving problems, creating structure where none exists, and working cross-functionally to drive measurable outcomes.</li>
<li>Deep familiarity with the modern data ecosystem and players such as Snowflake, Google, AWS, Microsoft, and Databricks.</li>
<li>The ability to navigate complex partner organizations and manage relationships with multiple stakeholders across competing interests.</li>
<li>Strong storytelling and positioning skills,you know how to distill joint value propositions into compelling messaging and content.</li>
<li>Comfort operating in a fast-paced, dynamic environment with high levels of ambiguity.</li>
<li>A broad understanding of integrated marketing strategies, including digital campaigns, field marketing, and industry events.</li>
<li>Exceptional communication skills, including concise writing and confident presentation abilities, especially with senior stakeholders.</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Experience working asynchronously within a remote, distributed team.</li>
<li>Prior experience working for or closely with any of dbt Labs&#39; strategic partners.</li>
<li>Familiarity with the role dbt Labs plays in the cloud data warehouse ecosystem and the modern data stack.</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Unlimited vacation time with a culture that actively encourages time off</li>
<li>401k plan with 3% guaranteed company contribution</li>
<li>Comprehensive healthcare coverage</li>
<li>Generous paid parental leave</li>
<li>Flexible stipends for:</li>
<li>Health &amp; Wellness</li>
<li>Home Office Setup</li>
<li>Cell Phone &amp; Internet</li>
<li>Learning &amp; Development</li>
<li>Office Space</li>
</ul>
<p><strong>Compensation</strong></p>
<p>We offer competitive compensation packages commensurate with experience, including salary, RSUs, and where applicable, performance-based pay. Our Talent Acquisition Team can answer questions around dbt Labs total rewards during your interview process.</p>
<p>In select locations (including Austin, Boston, Chicago, Denver, Los Angeles, Philadelphia, New York City, San Francisco, Washington, DC, and Seattle), an alternate range may apply, as specified below.</p>
<ul>
<li>The typical starting salary range for this role is: $132,000-$188,700</li>
<li>The typical starting salary range for this role in the select locations listed is: $147,000-209,000</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$132,000-$188,700</Salaryrange>
      <Skills>B2B marketing, Partner marketing, Digital campaigns, Field marketing, Industry events, Cloud data warehouse ecosystem, Modern data stack, Data engineering, Analytics engineering, Snowflake, Google, AWS, Microsoft, Databricks</Skills>
      <Category>Marketing</Category>
      <Industry>Technology</Industry>
      <Employername>dbt Labs</Employername>
      <Employerlogo>https://logos.yubhub.co/getdbt.com.png</Employerlogo>
      <Employerdescription>dbt Labs is a leading analytics engineering platform, now used by over 90,000 teams every week, driving data transformations and AI use cases.</Employerdescription>
      <Employerwebsite>https://www.getdbt.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673163005</Applyto>
      <Location>US - Remote</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>ed072f2b-181</externalid>
      <Title>Staff Accountant</Title>
      <Description><![CDATA[<p>We are seeking a Staff Accountant to join our Accounting team. As a Staff Accountant, you will be responsible for operational and corporate accounting responsibilities, including journal entries, accruals, reconciliations, and monthly close activities. Our ideal candidate will have experience and a desire to work in a fast-paced, dynamic environment.</p>
<p>This is a hybrid (3x per week) opportunity out of our corporate office in downtown Austin, TX.</p>
<p>As a member of our Accounting team, you will work directly with the Accounting Manager and Sr. Accountants during the month-end close cycle, leverage your technical software skills to optimize accounting processes and support financial reporting, ensure department &amp; vertical alignment across financial reporting systems and other organizational software systems, provide fluctuation analysis and insights to Accounting leadership for forecasting and financial reporting, and assist the team in developing and maintaining timely and accurate financial statements and reports in accordance with generally accepted accounting principles (GAAP).</p>
<p>In addition, you will ensure financial records are in compliance with company policies and procedures, assist in the ongoing process of upskilling the accounting team processes and controls to introduce automation and technology tools, and meet with other company stakeholders to complete tasks as needed.</p>
<p>To be successful in this role, you will need to have 2-6 years of hands-on accounting experience, a Bachelor&#39;s degree in Accounting, Business or Finance, basic operational knowledge of U.S. GAAP, strong technical skills with proficiency in accounting ERP software (e.g., NetSuite, etc.), and other financial reporting tools, and excellent analytical and problem-solving skills.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>accounting, financial reporting, U.S. GAAP, NetSuite, ERP software, Excel, Google Docs, Microsoft Word, PowerPoint, RPA, automation software, Snowflake virtual data warehouse</Skills>
      <Category>Finance</Category>
      <Industry>Energy</Industry>
      <Employername>RigUp</Employername>
      <Employerlogo>https://logos.yubhub.co/rigup.com.png</Employerlogo>
      <Employerdescription>RigUp is a source-to-pay solution built for energy, empowering leading energy companies and their suppliers to work better, together.</Employerdescription>
      <Employerwebsite>https://rigup.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/rigup/jobs/7533482003</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0b97b97d-56b</externalid>
      <Title>Solutions Engineer (pre-sales)</Title>
      <Description><![CDATA[<p>About Mixpanel</p>
<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel&#39;s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>
<p>As a Solutions Engineer (pre-sales) at Mixpanel, you will partner closely with Account Executives, Account Managers, Product, Engineering, and Support to successfully roll out self-serve analytics within our customer&#39;s organizations, help the customer manage change, execute on technical projects and services that delight our customers and ultimately drive ROI on the customer&#39;s Mixpanel investment.</p>
<p>Responsibilities</p>
<ul>
<li>Support Sales Engineers and Account Executives in deal cycles, contributing to technical discovery and solution design</li>
<li>Deliver standard and semi-customized product demos to prospects</li>
<li>Assist in qualifying customer use cases and identifying opportunities where Mixpanel can provide value</li>
<li>Contribute to proof-of-concept projects, including setup, execution, and documentation of results</li>
<li>Provide guidance on implementation best practices, including instrumentation and data structure</li>
<li>Collaborate with internal teams (Sales, Product, Engineering, Support) to ensure a smooth customer experience</li>
<li>Build relationships with customer stakeholders and respond to technical questions</li>
<li>Capture and share customer feedback with internal teams to inform product improvements</li>
</ul>
<p>We&#39;re Looking For Someone Who Has</p>
<ul>
<li>Ability to communicate with both technical and non-technical stakeholders</li>
<li>Some experience supporting technical sales cycles, customer implementations, or consulting engagements</li>
<li>3+ years of experience in Sales Engineering, Customer Success, Solutions Consulting, or a related role</li>
<li>Working knowledge of data concepts such as SQL, event tracking, or analytics tools</li>
<li>Familiarity with databases or cloud data warehouses (e.g., Snowflake, BigQuery, Redshift)</li>
<li>Strong problem-solving skills with the ability to work on moderately complex, well-defined problems</li>
<li>Solid communication and presentation skills</li>
<li>Ability to manage multiple workstreams with guidance</li>
<li>Interest in learning and applying new technologies, including AI tools</li>
<li>Willingness to travel as needed</li>
</ul>
<p>Compensation</p>
<p>The amount listed below is the total target cash compensation (TTCC) and includes base compensation and variable compensation in the form of either a company bonus or commissions. Variable compensation type is determined by your role and level. In addition to the cash compensation provided, this position is also eligible for equity consideration and other benefits including medical, vision, and dental insurance coverage.</p>
<p>Our salary ranges are determined by role and level and are benchmarked to the SF Bay Area Technology data cut released by Radford, a global compensation database. The range displayed represents the minimum and maximum TTCC for new hire salaries for the position across all of our US locations. To stay on top of market conditions, we refresh our salary ranges twice a year so these ranges may change in the future. Within the range, individual pay is determined by experience, job-related skills, qualifications, and other factors.</p>
<p>Mixpanel Compensation Range $170,000-$230,000 USD</p>
<p>Benefits and Perks</p>
<ul>
<li>Comprehensive Medical, Vision, and Dental Care</li>
<li>Mental Wellness Benefit</li>
<li>Generous Vacation Policy &amp; Additional Company Holidays</li>
<li>Enhanced Parental Leave</li>
<li>Volunteer Time Off</li>
<li>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</li>
</ul>
<p>Culture Values</p>
<ul>
<li>Make Bold Bets: We choose courageous action over comfortable progress.</li>
<li>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</li>
<li>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</li>
<li>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</li>
<li>Champion the Customer: We seek to deeply understand our customers&#39; needs, ensuring their success is our north star.</li>
<li>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</li>
</ul>
<p>Why choose Mixpanel?</p>
<p>We&#39;re a leader in analytics with over 9,000 customers and $277M raised from prominent investors: like Andreessen-Horowitz, Sequoia, YC, and, most recently, Bain Capital. Mixpanel&#39;s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics. Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>
<p>Choosing to work at Mixpanel means you&#39;ll be helping the world&#39;s most innovative companies learn from their data so they can make better decisions.</p>
<p>Mixpanel is an equal opportunity employer supporting workforce diversity. At Mixpanel, we are focused on things that really matter,our people, our customers, our partners,out of a recognition that those relationships are the most valuable assets we have. We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply. We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, veteran status, or disability status. Pursuant to the San Francisco Fair Chance Ordinance or other similar laws that may be applicable, we will consider for employment qualified applicants with arrest and conviction records.</p>
<p>We&#39;ve immersed ourselves in our Culture and Values as our guiding principles for the impact we want to have and the future</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$170,000-$230,000 USD</Salaryrange>
      <Skills>SQL, event tracking, analytics tools, databases, cloud data warehouses, Snowflake, BigQuery, Redshift</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a digital analytics platform that helps companies understand user behavior and track company success metrics.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7800289</Applyto>
      <Location>San Francisco, US (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7d922b91-9e7</externalid>
      <Title>Support Operations Analyst</Title>
      <Description><![CDATA[<p>As a Support Operations Analyst at Anthropic, you will build the analytical and workforce planning foundation that enables our support organisation to scale intelligently. This role sits at the intersection of data analysis, capacity planning, and operational strategy,providing the insights leadership needs to make confident decisions about staffing, investment, and service levels.</p>
<p>You&#39;ll own forecasting and capacity planning across our support organisation, including FTE teams, AI-powered support channels, and vendor/contractor partnerships. This means building models that predict volume based on product launches, model releases, and customer growth; analysing the relationship between support metrics and business outcomes; and ensuring we have the right resources in the right places to meet our service commitments.</p>
<p>Responsibilities:</p>
<p>Workforce Planning &amp; Forecasting</p>
<ul>
<li>Build and maintain staffing models that translate SLA targets into headcount requirements across FTE and vendor teams</li>
<li>Forecast support volume by analysing historical trends, product release calendars, model launches, and customer base growth projections</li>
<li>Factor AI support effectiveness (automation rates, deflection, Fin AI Agent performance) into capacity models to ensure accurate human staffing projections</li>
<li>Partner with vendor managers to align contractor capacity with demand forecasts and service level requirements</li>
<li>Model scenarios to inform strategic decisions about staffing investments, vendor mix, and coverage models</li>
<li>Develop frameworks for prioritising automation initiatives based on volume impact and deflection potential</li>
</ul>
<p>Analytics &amp; Reporting</p>
<ul>
<li>Maintain and enhance dashboards that track productivity, response times, CSAT, queue health, and other key support metrics</li>
<li>Investigate the relationship between support performance and business outcomes (e.g., how response time and satisfaction impact retention and churn)</li>
<li>Surface trends and insights that inform operational decisions,identifying what&#39;s driving volume, where bottlenecks emerge, and where investment is needed</li>
<li>Translate complex data into clear recommendations for leadership and cross-functional partners</li>
</ul>
<p>Operational Partnership</p>
<ul>
<li>Collaborate with Support Ops, AI Support, and Human Support teams to ensure data and forecasts align with operational reality</li>
<li>Partner with Finance on headcount planning, budget alignment, and quarterly capacity reviews</li>
<li>Work with Product and Engineering to anticipate how launches and feature changes will impact support demand</li>
<li>Contribute to vendor performance management by establishing metrics frameworks and reporting cadences</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$131,040-$165,000 USD</Salaryrange>
      <Skills>SQL, data warehouses, analysis tools, forecasting, capacity planning, workforce management, vendor management, Hex, Looker, BigQuery, Assembled, NICE, Calabrio</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a company that creates reliable, interpretable, and steerable AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5080931008</Applyto>
      <Location>San Francisco, CA | New York City, NY | Seattle, WA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>98161ddd-28c</externalid>
      <Title>Data Analyst III</Title>
      <Description><![CDATA[<p>Why join us</p>
<p>Brex is a finance platform that enables companies to spend smarter and move faster in over 200 markets. It combines global corporate cards and banking with intuitive spend management, bill pay, and travel software.</p>
<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>
<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>
<p>We&#39;re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>
<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>
<p>Data at Brex</p>
<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>
<p>Our Data Scientists, Analysts, and Engineers work together to make data,and insights derived from data,a core asset across the company.</p>
<p>What you&#39;ll do</p>
<p>As a senior Data Analyst (DA III), you will own the end-to-end analytics lifecycle for one or more business areas at Brex.</p>
<p>You&#39;ll go beyond building dashboards,you&#39;ll frame the right questions, design rigorous analyses, apply statistical methods, and translate your findings into clear recommendations for leadership.</p>
<p>You will also serve as a technical leader on the Data Analytics team, mentoring more junior analysts and helping define the standards and best practices that elevate the team&#39;s work.</p>
<p>This role sits at the intersection of analytics, analytics engineering, and business strategy.</p>
<p>You&#39;ll work in a modern data stack environment and partner closely with Data Scientists, Data Engineers, and senior leaders across the organization.</p>
<p>Where you&#39;ll work</p>
<p>This role will be based in our San Francisco office.</p>
<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home.</p>
<p>We currently require a minimum of three coordinated days in the office per week, Monday, Wednesday and Thursday.</p>
<p>As a perk, we also have up to four weeks per year of fully remote work!</p>
<p>Responsibilities</p>
<ul>
<li>Own the analytics lifecycle for assigned business areas: from problem framing and data sourcing through analysis, insight generation, and stakeholder presentation.</li>
</ul>
<ul>
<li>Build and maintain dashboards and self-service reporting tools that enable business teams to independently track performance, identify risks, and make data-driven decisions.</li>
</ul>
<ul>
<li>Write production-quality SQL and Python code to extract, transform, and analyze data at scale.</li>
</ul>
<ul>
<li>Collaborate with Data Engineers and Data Scientists to develop and maintain analytical data models, improve data pipelines, and ensure data quality across the organization.</li>
</ul>
<ul>
<li>Partner with leadership across Sales, Operations, Product, Finance, and other departments to identify high-impact analytical opportunities and deliver actionable recommendations.</li>
</ul>
<ul>
<li>Mentor other data analysts and contribute to the development of team standards, documentation, code review practices, and analytical frameworks.</li>
</ul>
<ul>
<li>Proactively identify gaps in data infrastructure, propose improvements, and contribute to the evolution of the team’s tooling and processes.</li>
</ul>
<p>Requirements</p>
<ul>
<li>5+ years of experience in data analytics, business intelligence, or a related quantitative role.</li>
</ul>
<ul>
<li>3+ years of experience partnering directly with Sales, Operations, Product, or equivalent business teams as an embedded analytics partner.</li>
</ul>
<ul>
<li>Advanced SQL proficiency, including CTEs, window functions, performance optimization, and working across complex data models.</li>
</ul>
<ul>
<li>Proficiency in Python for data analysis, automation, and modeling (Pandas, NumPy, scikit-learn, or similar).</li>
</ul>
<ul>
<li>Experience with cloud data warehouses, particularly Snowflake (BigQuery and Databricks also valued).</li>
</ul>
<ul>
<li>Hands-on experience with BI and data visualization tools (Looker, Tableau, Hex, or similar).</li>
</ul>
<ul>
<li>Strong stakeholder management skills,proven ability to present complex technical findings to non-technical audiences.</li>
</ul>
<ul>
<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automated reporting, and build self-service data tools.</li>
</ul>
<p>Bonus points</p>
<ul>
<li>Demonstrated experience applying statistical methods to business problems (e.g., regression, classification, A/B testing).</li>
</ul>
<ul>
<li>Experience with dbt for data modeling and transformation.</li>
</ul>
<ul>
<li>Experience building and maintaining data pipelines using orchestration tools such as Airflow.</li>
</ul>
<ul>
<li>Experience working with APIs for data ingestion and integration.</li>
</ul>
<ul>
<li>Familiarity with version control systems (Git).</li>
</ul>
<ul>
<li>Experience in fintech, financial services, or payments.</li>
</ul>
<ul>
<li>Track record of leading cross-functional analytics projects from scoping through delivery.</li>
</ul>
<p>Compensation</p>
<p>The expected salary range for this role is $114,192 - $142,740.</p>
<p>However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity.</p>
<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$114,192 - $142,740</Salaryrange>
      <Skills>Advanced SQL, Python, Cloud data warehouses, BI and data visualization tools, Stakeholder management, Generative AI and LLM-based tools, Statistical methods, dbt for data modeling and transformation, Orchestration tools, APIs for data ingestion and integration, Version control systems, Fintech, financial services, or payments</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is a finance platform that enables companies to spend smarter and move faster in over 200 markets. It combines global corporate cards and banking with intuitive spend management, bill pay, and travel software.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8463699002</Applyto>
      <Location>San Francisco, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>d301c7b8-b54</externalid>
      <Title>Manager, Solutions Engineering</Title>
      <Description><![CDATA[<p>About Mixpanel</p>
<p>Mixpanel turns data clarity into innovation. Its AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>
<p>As a Manager on the Solutions Engineering team at Mixpanel, you will lead a talented group of analytics consultants who are pivotal to our success. You will be at the forefront of driving customer value, guiding your team as they serve as the primary technical resources for our Sales organisation.</p>
<p>Responsibilities</p>
<ul>
<li>Develop &amp; Mentor: Lead, coach, and grow a high-performing and inclusive team of Solutions Engineers, actively investing in their career development and upholding a high standard of performance.</li>
</ul>
<ul>
<li>Drive Results: Partner closely with Sales leadership and Account Executives to provide technical expertise that drives new, retained, and expansion ARR. You will ensure your team&#39;s activities are directly contributing to the company&#39;s bottom line.</li>
</ul>
<ul>
<li>Prioritise &amp; Problem Solve: Guide your team through complex customer evaluations and technical challenges. You will manage team resources effectively, aligning the right skills to customer needs to achieve productivity targets and successful outcomes.</li>
</ul>
<ul>
<li>Cross-Functional Partnership: Act as a key technical liaison, collaborating with peer managers across Sales, Product, and Engineering. You will gather and synthesise customer feedback from your team to influence product strategy and solve problems at scale.</li>
</ul>
<ul>
<li>Communicate &amp; Manage Change: Effectively translate broader company and departmental strategy into clear, actionable goals for your team. You will guide your direct reports through evolving business priorities with empathy and clarity.</li>
</ul>
<ul>
<li>Hire the Best: Actively assess the needs of the team, build a pipeline of top talent, and hire outstanding individuals who elevate the team&#39;s capabilities and contribute to our inclusive culture.</li>
</ul>
<ul>
<li>Innovate &amp; Raise the Bar: Relentlessly seek to improve how your team operates, from refining demo strategies and proof-of-concept methodologies to adopting new tools and processes that increase effectiveness and celebrate success.</li>
</ul>
<p>We&#39;re Looking For Someone Who</p>
<ul>
<li>Has progressive experience in a B2B SaaS environment, including 3+ years of people management experience leading a technical pre-sales, solutions engineering, or professional services team.</li>
</ul>
<ul>
<li>Exhibits a &#39;player-coach&#39; mentality with deep knowledge in the data and analytics space. You are an expert on how data products (like CDPs, data warehouses, and analytics tools) are implemented and adopted by customers.</li>
</ul>
<ul>
<li>Is a proven cross-functional partner with a track record of successfully working with sales teams to navigate complex deals and drive revenue.</li>
</ul>
<ul>
<li>Demonstrates expertise in communicating complex technical concepts clearly and effectively to both technical and non-technical stakeholders.</li>
</ul>
<ul>
<li>Is skilled at prioritising team activities and managing workload in a dynamic environment, balancing customer needs with efficiency goals.</li>
</ul>
<ul>
<li>Is a natural mentor and developer of talent, with a passion for coaching and a history of building inclusive, high-achieving teams.</li>
</ul>
<ul>
<li>Handles ambiguity with ease, demonstrating flexibility and a proactive, problem-solving mindset when adapting to new challenges and business priorities.</li>
</ul>
<ul>
<li>Actively seeks feedback and is humble to learn, consistently looking for ways to improve themselves and their team.</li>
</ul>
<p>Bonus Points</p>
<ul>
<li>Previous experience in management consulting, strategic operations, or a similar role focused on go-to-market strategy.</li>
</ul>
<ul>
<li>Direct, hands-on experience with Mixpanel or other product analytics tools like Amplitude, Pendo, or Contentsquare.</li>
</ul>
<ul>
<li>Strong familiarity with the modern data stack, including tools like Snowflake, Google BigQuery, Segment, or Hightouch.</li>
</ul>
<p>Compensation</p>
<p>The amount listed below is the total target cash compensation (TTCC) and includes base compensation and variable compensation in the form of either a company bonus or commissions. Variable compensation type is determined by your role and level. In addition to the cash compensation provided, this position is also eligible for equity consideration and other benefits including medical, vision, and dental insurance coverage.</p>
<p>Our salary ranges are determined by role and level and are benchmarked to the SF Bay Area Technology data cut released by Radford, a global compensation database. The range displayed represents the minimum and maximum TTCC for new hire salaries for the position across all of our US locations. To stay on top of market conditions, we refresh our salary ranges twice a year so these ranges may change in the future. Within the range, individual pay is determined by experience, job-related skills, qualifications, and other factors.</p>
<p>If you have questions about the specific range, your recruiter can share this information.</p>
<p>Mixpanel Compensation Range: $238,300-$321,705 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$238,300-$321,705 USD</Salaryrange>
      <Skills>product analytics, data and analytics, data products, CDPs, data warehouses, analytics tools, Mixpanel, Amplitude, Pendo, Contentsquare, Snowflake, Google BigQuery, Segment, Hightouch</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a leader in analytics with over 29,000 companies using its platform, including Workday, Pinterest, LG, and Rakuten Viber.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7746430</Applyto>
      <Location>New York City, US (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>760c3e88-e35</externalid>
      <Title>Senior Product Manager, Data</Title>
      <Description><![CDATA[<p>Job Title: Senior Product Manager, Data</p>
<p>We are seeking a Senior Product Manager to support the development of CoreWeave&#39;s Enterprise Data Platform within the CIO organization. This role will contribute to building a scalable, high-performance data lake and data architecture, integrating data from key sources across Operations, Engineering, Sales, Finance, and other IT partners.</p>
<p>As a Senior Product Manager for Data Infrastructure and Analytics, you will help drive data ingestion, transformation, governance, and analytics enablement. You will collaborate with engineering, analytics, finance, and business teams to help deliver data lake and pipeline orchestration solutions, ensuring accessible data for business insights.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Own and evangelize Data Platform and Business Analytics roadmap and strategy across CoreWeave</li>
<li>Assist with the execution of CoreWeave&#39;s enterprise data architecture, helping enable the data lake and domain-driven data layer</li>
<li>Support the development and enhancement of data ingestion, transformation, and orchestration pipelines for scalability, efficiency, and reliability</li>
<li>Work with the Engineering and Data teams to maintain and enhance data pipelines for both structured and unstructured data, enabling efficient data movement across the organization</li>
<li>Collaborate with Finance, GTM, Infrastructure, Data Center, and Supply Chain teams to help unify and model data from core systems (ERP, CRM, Asset Mgmt, Supply Chain systems, etc.)</li>
<li>Contribute to data governance and quality initiatives, focusing on data consistency, lineage tracking, and compliance with security standards</li>
<li>Support the BI and analytics layer by partnering with stakeholders to enable data products, dashboards, and reporting capabilities</li>
<li>Help prioritize data-driven initiatives, ensuring alignment with business goals and operational needs in coordination with leadership</li>
</ul>
<p>Requirements:</p>
<ul>
<li>5+ years of experience in data product management, data architecture, or enterprise data engineering roles</li>
<li>Familiarity with data lakes, data warehouses, ETL/ELT and streaming pipelines, and data governance frameworks</li>
<li>Hands-on experience with modern data stack technologies (such as Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka)</li>
<li>Understanding of data modeling, domain-driven design, and creating scalable data platforms</li>
<li>Experience supporting the end-to-end data product lifecycle, including requirements gathering and implementation</li>
<li>Strong collaboration skills with engineering, analytics, and business teams to help deliver data initiatives</li>
<li>Awareness of data security, compliance, and governance best practices</li>
<li>Understanding of BI and analytics platforms (such as Tableau, Looker, Power BI) and supporting self-service analytics</li>
</ul>
<p>Why CoreWeave?</p>
<p>At CoreWeave, we work hard, have fun, and move fast! We&#39;re in an exciting stage of hyper-growth that you will not want to miss out on. We&#39;re not afraid of a little chaos, and we&#39;re constantly learning. Our team cares deeply about how we build our product and how we work together, which is represented through our core values:</p>
<ul>
<li>Be Curious at Your Core</li>
<li>Act Like an Owner</li>
<li>Empower Employees</li>
<li>Deliver Best-in-Class Client Experiences</li>
<li>Achieve More Together</li>
</ul>
<p>We support and encourage an entrepreneurial outlook and independent thinking. We foster an environment that encourages collaboration and provides the opportunity to develop innovative solutions to complex problems. As we get set for take off, the growth opportunities within the organization are constantly expanding. You will be surrounded by some of the best talent in the industry, who will want to learn from you, too. Come join us!</p>
<p>Salary Range: $143,000 to $210,000</p>
<p>Benefits:</p>
<ul>
<li>Medical, dental, and vision insurance - 100% paid for by CoreWeave</li>
<li>Company-paid Life Insurance</li>
<li>Voluntary supplemental life insurance</li>
<li>Short and long-term disability insurance</li>
<li>Flexible Spending Account</li>
<li>Health Savings Account</li>
<li>Tuition Reimbursement</li>
<li>Ability to Participate in Employee Stock Purchase Program (ESPP)</li>
<li>Mental Wellness Benefits through Spring Health</li>
<li>Family-Forming support provided by Carrot</li>
<li>Paid Parental Leave</li>
<li>Flexible, full-service childcare support with Kinside</li>
<li>401(k) with a generous employer match</li>
<li>Flexible PTO</li>
<li>Catered lunch each day in our office and data center locations</li>
<li>A casual work environment</li>
<li>A work culture focused on innovative disruption</li>
</ul>
<p>Workplace:</p>
<p>While we prioritize a hybrid work environment, remote work may be considered for candidates located more than 30 miles from an office, based on role requirements for specialized skill sets. New hires will be invited to attend onboarding at one of our hubs within their first month. Teams also gather quarterly to support collaboration.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$143,000 to $210,000</Salaryrange>
      <Skills>data product management, data architecture, enterprise data engineering, data lakes, data warehouses, ETL/ELT and streaming pipelines, data governance frameworks, modern data stack technologies, Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka, data modeling, domain-driven design, scalable data platforms, BI and analytics platforms, Tableau, Looker, Power BI</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>CoreWeave</Employername>
      <Employerlogo>https://logos.yubhub.co/coreweave.com.png</Employerlogo>
      <Employerdescription>CoreWeave is a cloud-based platform that enables innovators to build and scale AI with confidence.</Employerdescription>
      <Employerwebsite>https://www.coreweave.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/coreweave/jobs/4649824006</Applyto>
      <Location>Livingston, NJ / New York, NY / Sunnyvale, CA / Bellevue, WA/San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6e6544bc-9bc</externalid>
      <Title>Staff Machine Learning Engineer, Listings and Host Tools Data and AI</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Staff Machine Learning Engineer to join our Listings and Host Tools Data and AI team. As a member of this team, you will support host personalization products and provide data-driven solutions to achieve a superior host experience on Airbnb.</p>
<p>The Listings and Host Tools Data and AI team owns data pipelines and ML models and builds services for serving that are used in the above areas. We leverage open source, third-party, and homegrown ML models to improve the Host and Guest experience.</p>
<p>As an ML engineer, you will partner closely with our data science, product partners, and other ML + data engineers on the team to execute on these opportunities in order to improve the Host and Guest product experience on Airbnb.</p>
<p>Your responsibilities will include:</p>
<ul>
<li>Working with large-scale structured and unstructured data to build and continuously improve cutting-edge Machine Learning models for Airbnb product, business, and operational use cases.</li>
</ul>
<ul>
<li>Collaborating with cross-functional partners, including software engineers, product managers, operations, and data scientists, to identify opportunities for business impact, understand, refine, and prioritize requirements for machine learning models, drive engineering decisions, and quantify impact.</li>
</ul>
<ul>
<li>Prototyping machine learning use cases for use in the product and working with stakeholders to iterate on requirements.</li>
</ul>
<ul>
<li>Developing, productionizing, and operating Machine Learning models and pipelines at scale, including both batch and real-time use cases.</li>
</ul>
<ul>
<li>Designing and building services and APIs to enable serving ML model-driven data to product use cases.</li>
</ul>
<p>We&#39;re looking for someone with 8+ years of industry experience in applied Machine Learning, including a Master&#39;s or Ph.D. in a relevant field. You should have experience in both Natural Language Processing and Computer Vision, as well as strong programming and data engineering skills.</p>
<p>You should also have a deep understanding of Machine Learning best practices, algorithms, and domains, as well as experience with technologies such as TensorFlow, PyTorch, Kubernetes, Spark, Airflow, and data warehouses.</p>
<p>If you&#39;re passionate about building end-to-end Machine Learning infrastructure and productionizing Machine Learning models, we&#39;d love to hear from you!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$204,000-$255,000 USD</Salaryrange>
      <Skills>Machine Learning, Natural Language Processing, Computer Vision, Programming, Data Engineering, TensorFlow, PyTorch, Kubernetes, Spark, Airflow, Data Warehouses</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Airbnb</Employername>
      <Employerlogo>https://logos.yubhub.co/airbnb.com.png</Employerlogo>
      <Employerdescription>Airbnb is a global online marketplace for short-term vacation rentals. It was founded in 2007 and has since grown to become one of the largest online marketplaces for unique stays and experiences.</Employerdescription>
      <Employerwebsite>https://www.airbnb.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/airbnb/jobs/7454348</Applyto>
      <Location>Remote-USA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>45f997b8-a5c</externalid>
      <Title>Salesforce Administrator</Title>
      <Description><![CDATA[<p>As a Salesforce Administrator at Anthropic, you will play a key role in building and maintaining world-class CRM systems for our GTM team. In this role, you&#39;ll work both autonomously and collaboratively to design and implement tactical and project-related deliverables that support our business growth. You&#39;ll work closely with other Revenue Operations team members and business partners to understand requirements and drive strategic solutions, always with a focus on flexibility, maintainability, and scale.</p>
<p>Responsibilities:</p>
<p>Configure and customize Salesforce to meet business needs Optimize QTC solutions with Salesforce and other systems (Ironclad, Stripe and Metronome) Design and develop high quality solutions in an agile environment including partner management, forecasting, territory mapping, internationalization, pipeline optimization, and lead management processes Work closely with operations and business partners to understand and refine requirements, objectives, and processes Build for scale by designing holistically, with a focus on flexibility and maintainability Ensure appropriate controls and documentation are followed to create an effective control environment Deal with ambiguity in a rapidly changing business environment, resolve problems and offer impacting solutions Configure and maintain other business systems like Hubspot, Apollo, and Clay Build and optimize integrations between Salesforce and other platforms</p>
<p>You may be a good fit if you:</p>
<p>Have 3+ years of Salesforce admin experience Have experience implementing and maintaining QTC/CPQ solutions Hold Salesforce Basic and Advanced Certifications (Advanced Administrator certification, Advanced Developer or Platform II certification) Have in-depth understanding of the capabilities and constraints of the Salesforce CRM application architecture Have experience with data migration and integrating Salesforce with other platforms/services Have worked with financial systems such as Stripe and Netsuite (Metronome a plus) Have basic SQL skills and experience working with data warehouses Have proven experience scaling Salesforce implementations in high-growth environments Have experience working closely with and implementing solutions for a fast-growing GTM organization.</p>
<p>Possess excellent analytical skills, combined with impeccable business judgment Can communicate effectively with management, sales, marketing, vendors and international teams</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$190,000-$270,000 USD</Salaryrange>
      <Skills>Salesforce, QTC, Ironclad, Stripe, Metronome, Hubspot, Apollo, Clay, data migration, SQL, data warehouses</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic creates reliable, interpretable, and steerable AI systems. It is a public benefit corporation headquartered in San Francisco.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5075070008</Applyto>
      <Location>San Francisco, CA | New York City, NY | Seattle, WA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>e00b7052-70b</externalid>
      <Title>Senior Business Systems Analyst, Finance Systems</Title>
      <Description><![CDATA[<p>We are seeking an experienced Senior Business Systems Analyst to join our Finance Systems team at Anthropic. In this role, you will serve as the internal functional lead for our Workday Financials implementation, owning the design and configuration of the Financial Data Model (FDM), Chart of Accounts, and dimensional structures that will serve as the source of truth for financial reporting.</p>
<p>You will develop Prism Analytics and Accounting Center solutions, gather requirements and build reporting capabilities, and collaborate closely with cross-functional teams to drive the successful adoption of our new ERP platform.</p>
<p>This is a critical role that will directly shape how Anthropic&#39;s finance organisation operates as we scale toward public company readiness. You will work at the intersection of finance domain expertise and technical implementation, partnering with the implementation partner, engineering teams, and finance stakeholders to build a world-class financial systems foundation.</p>
<p>Responsibilities:</p>
<ul>
<li>ERP Core Financials Implementation: Serve as internal functional lead for Workday Financials implementation, partnering with consultants to drive configuration decisions, validate designs, and ensure business requirements are met</li>
</ul>
<ul>
<li>Financial Data Model (FDM) Design: Own the design and configuration of Chart of Accounts, Worktags, dimensional hierarchies, and Accounting Books that will serve as the source of truth for all financial reporting, ensuring support for both GAAP and Management reporting requirements</li>
</ul>
<ul>
<li>Prism Analytics Development: Develop and maintain Prism/Accounting Center solutions from source analysis and ingestion design through build, testing, cutover, and hypercare, including integration with external data sources like BigQuery and Pigment</li>
</ul>
<ul>
<li>Requirements Gathering &amp; Reporting: Gather business requirements from Finance, Accounting, and FP&amp;A stakeholders, translating them into hands-on development of executive reporting, dashboards, and analytics solutions</li>
</ul>
<ul>
<li>Workshop Participation &amp; Solution Design: Participate in implementation workshops, challenge requirements, and translate business needs into buildable designs and testable acceptance criteria; manage defects and data quality issues throughout the project lifecycle</li>
</ul>
<ul>
<li>Cross-Functional Collaboration: Collaborate with Integrations, Security, and Financials configuration teams to align master data, journals, controls, and performance service level agreements; partner with Data Infrastructure and BizTech teams on system integrations</li>
</ul>
<ul>
<li>Cutover &amp; Hypercare Planning: Prepare cutover plans, data migration strategies, reconciliation frameworks, and hypercare plans; document data lineage, controls, and audit artifacts to support SOX compliance requirements</li>
</ul>
<ul>
<li>Platform Expansion &amp; Adoption: Work closely with engineering teams and business stakeholders to drive ongoing expansion and adoption of the Workday platform, identifying opportunities for process improvement and automation</li>
</ul>
<p>You may be a good fit if you:</p>
<ul>
<li>Have 8+ years of experience in finance systems, ERP implementation, or business systems analysis roles, with at least 5 years of hands-on Workday Financials experience</li>
</ul>
<ul>
<li>Possess deep expertise in Workday Financial Data Model (FDM), including Chart of Accounts design, Worktags configuration, dimensional hierarchies, and Accounting Books setup</li>
</ul>
<ul>
<li>Have strong experience with Workday Prism Analytics, including data modeling, source integration, calculated fields, and report development</li>
</ul>
<ul>
<li>Are skilled at translating complex business requirements into technical solutions, bridging the gap between finance stakeholders and technical implementation teams</li>
</ul>
<ul>
<li>Have experience with full ERP implementation lifecycles, including requirements gathering, configuration, testing, data migration, cutover planning, and hypercare</li>
</ul>
<ul>
<li>Possess strong understanding of financial accounting processes including General Ledger, multi-entity consolidation, intercompany accounting, and management reporting</li>
</ul>
<ul>
<li>Have excellent stakeholder management and communication skills, with ability to work effectively with finance leadership, accounting teams, and technical partners</li>
</ul>
<ul>
<li>Demonstrate strong analytical and problem-solving skills with attention to detail and commitment to data accuracy and integrity</li>
</ul>
<ul>
<li>Are comfortable working in fast-paced, high-growth environments with evolving requirements and tight timelines</li>
</ul>
<p>Strong candidates may also have:</p>
<ul>
<li>Background in accounting, finance, or CPA certification with understanding of GAAP/IFRS reporting requirements</li>
</ul>
<ul>
<li>Experience with Workday Accounting Center for complex journal automation and subledger accounting</li>
</ul>
<ul>
<li>Technical proficiency with SQL, Python, or scripting languages for data analysis and integration support</li>
</ul>
<ul>
<li>Experience integrating Workday with external data platforms such as BigQuery or cloud data warehouses</li>
</ul>
<ul>
<li>Knowledge of SOX compliance requirements and internal controls for financial systems</li>
</ul>
<ul>
<li>Experience with EPM/FP&amp;A systems such as Pigment, Anaplan, or Adaptive Planning and their integration with ERP</li>
</ul>
<ul>
<li>Prior experience at high-growth technology companies scaling toward IPO readiness</li>
</ul>
<ul>
<li>Familiarity with Workday HCM and understanding of HCM-Financials integration points</li>
</ul>
<ul>
<li>Experience with data migration tools, ETL processes, and reconciliation frameworks for ERP implementations</li>
</ul>
<p>The annual compensation range for this role is $205,000-$265,000 USD.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$205,000-$265,000 USD</Salaryrange>
      <Skills>Workday Financials, Workday Financial Data Model (FDM), Chart of Accounts design, Worktags configuration, Dimensional hierarchies, Accounting Books setup, Prism Analytics, Data modeling, Source integration, Calculated fields, Report development, ERP implementation lifecycles, Requirements gathering, Configuration, Testing, Data migration, Cutover planning, Hypercare, Financial accounting processes, General Ledger, Multi-entity consolidation, Intercompany accounting, Management reporting, Stakeholder management, Communication skills, Analytical skills, Problem-solving skills, Data accuracy and integrity, SQL, Python, Scripting languages, BigQuery, Cloud data warehouses, SOX compliance requirements, Internal controls, EPM/FP&amp;A systems, Pigment, Anaplan, Adaptive Planning, ERP integration, High-growth technology companies, IPO readiness, Workday HCM, HCM-Financials integration points, Data migration tools, ETL processes, Reconciliation frameworks</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.co.png</Employerlogo>
      <Employerdescription>Anthropic is a technology company focused on creating reliable, interpretable, and steerable AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.co/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/4991194008</Applyto>
      <Location>San Francisco, CA | Seattle, WA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7a6a5e65-740</externalid>
      <Title>Data Analyst III</Title>
      <Description><![CDATA[<p>Why join us</p>
<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>
<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>
<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream. We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>
<p>Data at Brex</p>
<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations. Our Data Scientists, Analysts, and Engineers work together to make data,and insights derived from data,a core asset across the company.</p>
<p>What you’ll do</p>
<p>As a senior Data Analyst (DA III), you will own the end-to-end analytics lifecycle for one or more business areas at Brex. You’ll go beyond building dashboards,you’ll frame the right questions, design rigorous analyses, apply statistical methods, and translate your findings into clear recommendations for leadership. You will also serve as a technical leader on the Data Analytics team, mentoring more junior analysts and helping define the standards and best practices that elevate the team’s work.</p>
<p>This role sits at the intersection of analytics, analytics engineering, and business strategy. You’ll work in a modern data stack environment and partner closely with Data Scientists, Data Engineers, and senior leaders across the organization.</p>
<p>Where you’ll work</p>
<p>This role will be based in our New York office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of three coordinated days in the office per week, Monday, Wednesday and Thursday. As a perk, we also have up to four weeks per year of fully remote work!</p>
<p>Responsibilities</p>
<ul>
<li>Own the analytics lifecycle for assigned business areas: from problem framing and data sourcing through analysis, insight generation, and stakeholder presentation.</li>
</ul>
<ul>
<li>Build and maintain dashboards and self-service reporting tools that enable business teams to independently track performance, identify risks, and make data-driven decisions.</li>
</ul>
<ul>
<li>Write production-quality SQL and Python code to extract, transform, and analyze data at scale.</li>
</ul>
<ul>
<li>Collaborate with Data Engineers and Data Scientists to develop and maintain analytical data models, improve data pipelines, and ensure data quality across the organization.</li>
</ul>
<ul>
<li>Partner with leadership across Sales, Operations, Product, Finance, and other departments to identify high-impact analytical opportunities and deliver actionable recommendations.</li>
</ul>
<ul>
<li>Mentor other data analysts and contribute to the development of team standards, documentation, code review practices, and analytical frameworks.</li>
</ul>
<ul>
<li>Proactively identify gaps in data infrastructure, propose improvements, and contribute to the evolution of the team’s tooling and processes.</li>
</ul>
<p>Requirements</p>
<ul>
<li>5+ years of experience in data analytics, business intelligence, or a related quantitative role.</li>
</ul>
<ul>
<li>3+ years of experience partnering directly with Sales, Operations, Product, or equivalent business teams as an embedded analytics partner.</li>
</ul>
<ul>
<li>Advanced SQL proficiency, including CTEs, window functions, performance optimization, and working across complex data models.</li>
</ul>
<ul>
<li>Proficiency in Python for data analysis, automation, and modeling (Pandas, NumPy, scikit-learn, or similar).</li>
</ul>
<ul>
<li>Experience with cloud data warehouses, particularly Snowflake (BigQuery and Databricks also valued).</li>
</ul>
<ul>
<li>Hands-on experience with BI and data visualization tools (Looker, Tableau, Hex, or similar).</li>
</ul>
<ul>
<li>Strong stakeholder management skills,proven ability to present complex technical findings to non-technical audiences.</li>
</ul>
<ul>
<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automated reporting, and build self-service data tools.</li>
</ul>
<p>Bonus points</p>
<ul>
<li>Demonstrated experience applying statistical methods to business problems (e.g., regression, classification, A/B testing).</li>
</ul>
<ul>
<li>Experience with dbt for data modeling and transformation.</li>
</ul>
<ul>
<li>Experience building and maintaining data pipelines using orchestration tools such as Airflow.</li>
</ul>
<ul>
<li>Experience working with APIs for data ingestion and integration.</li>
</ul>
<ul>
<li>Familiarity with version control systems (Git).</li>
</ul>
<ul>
<li>Experience in fintech, financial services, or payments.</li>
</ul>
<ul>
<li>Track record of leading cross-functional analytics projects from scoping through delivery.</li>
</ul>
<p>Compensation</p>
<p>The expected salary range for this role is $114,192 - $142,740. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$114,192 - $142,740</Salaryrange>
      <Skills>SQL, Python, Cloud data warehouses, BI and data visualization tools, Stakeholder management, Generative AI and LLM-based tools, Statistical methods, dbt for data modeling and transformation, Orchestration tools, APIs for data ingestion and integration, Version control systems, Fintech, financial services, or payments</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is a finance platform that enables companies to spend smarter and move faster in over 200 markets. It combines global corporate cards and banking with intuitive spend management, bill pay, and travel software.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8463704002</Applyto>
      <Location>New York, New York, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>25a942db-90e</externalid>
      <Title>Senior Data Analyst (Auth0)</Title>
      <Description><![CDATA[<p>Secure Every Identity, from AI to Human</p>
<p>Identity is the key to unlocking the potential of AI. Okta secures AI by building the trusted, neutral infrastructure that enables organisations to safely embrace this new era. This work requires a relentless drive to solve complex challenges with real-world stakes. We are looking for builders and owners who operate with speed and urgency and execute with excellence.</p>
<p>This is an opportunity to do career-defining work. We&#39;re all in on this mission. If you are too, let&#39;s talk.</p>
<p><strong>What is the Developer Led Growth (DLG) Team?</strong></p>
<p>At the heart of our mission is a simple goal: to make choosing and integrating authentication a delightful experience for every developer.</p>
<p>We’re the DLG team, essentially Product Led Growth (PLG), but for developers using Auth0. We are a high-impact, multidisciplinary engine that includes Developer Relations, Developer Marketing, our Startup Program, Product Activation and Conversion. We don&#39;t just move metrics, we build the journey that turns curious developers into lifelong advocates.</p>
<p><strong>Why This Role?</strong></p>
<ul>
<li>Great Visibility: You will provide the analytical backbone for the entire DLG organisation, turning cross-functional data into a unified narrative.</li>
</ul>
<ul>
<li>Autonomy: We trust you to own your domain. This isn&#39;t a role for a cog in the machine, you’ll have the freedom to identify opportunities and drive the strategy.</li>
</ul>
<ul>
<li>True Variety: From analysing startup ecosystem trends to optimising conversion funnels, no two weeks will look the same.</li>
</ul>
<ul>
<li>Deep Collaboration: You’ll sit at the intersection of product, marketing, and community, working with stakeholders across the entire company to fuel our growth.</li>
</ul>
<p>If you’re a curious analyst who thrives in a fast-paced, developer first environment and wants to see their work directly influence how the world’s engineers build, we’d love to meet you.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Build and maintain Tableau dashboards and reports related to the DLG business</li>
</ul>
<ul>
<li>Build and maintain DBT models and custom SQL queries to analyse DLG business</li>
</ul>
<ul>
<li>Investigate and understand discrepancies in the data</li>
</ul>
<ul>
<li>Help DLG automate and scale processes, find areas of opportunity for process improvements.</li>
</ul>
<ul>
<li>Analyse A/B experiments and DLG initiatives.</li>
</ul>
<ul>
<li>Collaborate with sales, ops, marketing, finance and product to understand their challenges, dive into the data and share insights that solve business problems.</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>+5 years of experience working in data analytics-related fields.</li>
</ul>
<ul>
<li>Bachelor’s degree in math, statistics, computer science, economics or relevant field.</li>
</ul>
<ul>
<li>Strong proficiency in SQL and data modelling: cleaning, modelling and transforming large and complex datasets into usable and trusted models.</li>
</ul>
<ul>
<li>Familiarity with DBT (data build tool), Git and Snowflake data warehouse is preferred.</li>
</ul>
<ul>
<li>Working knowledge of visualisation tools, ideally Tableau, and Google Sheets.</li>
</ul>
<ul>
<li>Ability to analyse data discrepancies and troubleshoot data issues.</li>
</ul>
<ul>
<li>Excellent problem-solving skills with the ability to think critically and translate complex data into actionable insights.</li>
</ul>
<ul>
<li>Familiar with AI tooling, e.g. Copilot, Clay.</li>
</ul>
<ul>
<li>Excellent verbal and written communication skills, with the ability to collaborate with cross-functional teams including technical teams (data engineering) and business users (marketing, product, revenue, etc).</li>
</ul>
<ul>
<li>Stakeholder management and prioritisation skills</li>
</ul>
<p>Nice to have</p>
<ul>
<li>PLG experience</li>
</ul>
<ul>
<li>Experience building and working with AI Agents</li>
</ul>
<ul>
<li>Translating complex developer behaviour into product insights</li>
</ul>
<ul>
<li>AB testing experimentation experience</li>
</ul>
<ul>
<li>Revenue modelling and forecasting knowledge</li>
</ul>
<p>The annual base salary range for this position for candidates located in California (excluding San Francisco Bay Area), Colorado, Illinois, New York, and Washington is between: $114,000-$156,200 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$114,000-$156,200 USD</Salaryrange>
      <Skills>SQL, DBT, Git, Snowflake data warehouse, Tableau, Google Sheets, AI tooling, Copilot, Clay, PLG experience, Experience building and working with AI Agents, Translating complex developer behaviour into product insights, AB testing experimentation experience, Revenue modelling and forecasting knowledge</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Okta</Employername>
      <Employerlogo>https://logos.yubhub.co/okta.com.png</Employerlogo>
      <Employerdescription>Okta is a company that provides identity and access management solutions.</Employerdescription>
      <Employerwebsite>https://www.okta.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/okta/jobs/7683013</Applyto>
      <Location>Chicago, Illinois; New York, New York; Washington, DC</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>e500a909-0d2</externalid>
      <Title>Senior Data Analyst</Title>
      <Description><![CDATA[<p>As a Senior Data Analyst, you will play a crucial role in our data-driven decision-making process. You will be responsible for turning raw data into actionable insights that will shape our product strategy and drive business growth.</p>
<p>This role requires a deep understanding of product analytics, a strong technical skillset, and a forward-thinking mindset to leverage AI in your analysis.</p>
<p>Responsibilities:</p>
<ul>
<li>Design and build insightful and user-friendly dashboards and reports in Tableau to track key product metrics and performance indicators.</li>
<li>Utilize Snowflake and dbt to build and maintain robust and scalable data models and pipelines for our analytics needs.</li>
<li>Conduct in-depth product analysis to identify trends, patterns, and opportunities for product improvement and growth.</li>
<li>Write complex SQL queries and use Python to perform advanced data analysis.</li>
<li>Collaborate with product managers, engineers, and other stakeholders to understand their data needs and provide them with the insights they need to make informed decisions.</li>
<li>Proactively identify and explore new ways to leverage AI and machine learning to enhance our analytical capabilities and unlock new insights from our data.</li>
<li>Communicate your findings and recommendations effectively to both technical and non-technical audiences.</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>Proven experience as a Data Analyst or in a similar role, with a focus on product analytics.</li>
<li>3-5 years of experience.</li>
<li>Expert-level proficiency in SQL.</li>
<li>Strong experience with data visualization tools, particularly Tableau.</li>
<li>Hands-on experience with cloud data warehouses like Snowflake and data transformation tools like dbt.</li>
<li>Advanced analytics, data science, AI/ML experience and techniques are a plus.</li>
<li>A strong analytical mindset with the ability to think critically and solve complex problems.</li>
<li>A proactive and curious mindset with a strong desire to learn and experiment with new technologies, especially in the realm of AI.</li>
<li>Excellent communication and collaboration skills.</li>
</ul>
<p>Added Advantage:</p>
<ul>
<li>Experience with other BI tools like Looker.</li>
<li>Experience with statistical analysis and machine learning.</li>
<li>Familiarity with product analytics platforms like Amplitude or Mixpanel</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Tableau, Snowflake, dbt, Python, Data Analysis, Data Visualization, Cloud Data Warehouses, Data Transformation, Advanced Analytics, Data Science, AI/ML, Statistical Analysis, Machine Learning, Product Analytics Platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Okta</Employername>
      <Employerlogo>https://logos.yubhub.co/okta.com.png</Employerlogo>
      <Employerdescription>Okta is a software company that provides identity and access management solutions.</Employerdescription>
      <Employerwebsite>https://www.okta.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/okta/jobs/7731263</Applyto>
      <Location>Bengaluru, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>be4a55b1-c38</externalid>
      <Title>Partner Enablement Scaled Programs Lead</Title>
      <Description><![CDATA[<p>As the Scaled Partner Enablement Lead, you will be part of a visionary team that is architecting our partner enablement strategy. You will design, build, and execute high-impact learning programs spanning technical certifications, sales readiness, specialization, and onboarding programs to ensure our partners sell and deliver Databricks solutions with total confidence.\n\nThis role demands a master of strategic planning and operational excellence. You will manage complex global initiatives, collaborate with cross-functional stakeholders, and translate business needs into world-class learning experiences. You’re joining a high-performance team that recently tripled our trained partner base in just 12 months, using innovative learning models to transform the Databricks ecosystem at scale.\n\n### Key Responsibilities\n\n<em>   Strategic Alignment: Partner with the leadership, partner-facing teams, and partners themselves to identify critical needs and translate business requirements into high-impact enablement programs.\n</em>   Performance Accountability: Drive and own the results for your partner portfolio, maintaining full accountability for key metrics, including training completion and certification targets.\n<em>   Build Learning Programs: Collaborate with internal Subject Matter Experts (SMEs) to define best-in-class enablement programs and establish rigorous success measures.\n</em>   Roadmap Management: Lead the roadmap for training initiatives, including large-scale learning events, global workshop schedules, and the Go-to-Market strategy for partner enablement programs.\n<em>   Operational Excellence: Oversee the end-to-end program lifecycle from initial intake and resourcing to scheduling, logistics, and task management, ensuring seamless execution.\n</em>   Data-Driven Insights: Monitor and analyze KPIs such as Certification rates, ROI, and program impact.\n<em>   Quality Assurance: Evaluate, audit, and coach cross-functional training resources to ensure the delivery of &quot;best-in-class&quot; learning experiences.\n\n### Minimum Qualifications\n\n</em>   Education: Bachelor’s degree in a technical discipline or equivalent practical experience.\n<em>   Experience: 6+ years of experience developing and running large-scale training and certification programs within a global tech ecosystem.\n</em>   Ecosystem Knowledge: Strong understanding of the System Integrator business model and how technical partners successfully go to market.\n<em>   Technical knowledge: Deep understanding of Data AI technologies, including topics as Data Warehouse, Data Transformation, Machine Learning, and Generative AI, ideally having been certified in at least one of these technologies.\n</em>   Program Management: Proven track record of managing technical competency models and learning journeys for geographically dispersed teams.\n<em>   Communication: Exceptional verbal and written communication skills with the ability to influence cross-functional stakeholders.\n\n### Other desired Qualifications\n\n</em>   Experience specifically within the Data + AI or Cloud Infrastructure space.\n<em>   Familiarity with scaling programs within the Cloud hypercallers such as Google, Microsoft, and AWS, or similar cloud partner ecosystems.\n</em>   Proficiency in data visualization tools to report on program ROI.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>strategic planning, operational excellence, program management, data analysis, communication, technical certifications, sales readiness, specialization, onboarding programs, Data AI technologies, Data Warehouse, Data Transformation, Machine Learning, Generative AI, Cloud Infrastructure, Google, Microsoft, AWS, data visualization tools</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. It has over 10,000 customers worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/7839714002</Applyto>
      <Location>Bengaluru, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b1fa4435-fc2</externalid>
      <Title>Business Systems Analyst, Data Enrichment</Title>
      <Description><![CDATA[<p>We are seeking a Business Systems Analyst, Data Enrichment to own and drive the strategy, architecture, and execution of our data enrichment ecosystem. This role sits at the intersection of Revenue Operations, Data Engineering, and Go-to-Market strategy, and is responsible for building and maintaining a best-in-class enrichment infrastructure that delivers a reliable, comprehensive source of truth for company and contact data across global markets.</p>
<p>You will be the subject matter expert and product owner for all enrichment tools, data sources, and processes,including platforms like Clay, Dun &amp; Bradstreet, ZoomInfo, and other third-party providers. You will design and operate the systems that power account hierarchies, firmographic enrichment, contact discovery, and signal detection, ensuring our GTM teams have the accurate, complete data they need to identify, prioritize, and close business.</p>
<p>This is a hands-on, technically-oriented role that requires deep experience working with large datasets, complex system integrations, and Salesforce data modeling. You will collaborate closely with Sales, Marketing, Data Science, Data Engineering, and Revenue Operations to ensure our enrichment strategy supports both near-term GTM execution and long-term data infrastructure goals.</p>
<p>Responsibilities:</p>
<ul>
<li>Own the end-to-end enrichment strategy and roadmap, serving as the product owner for all enrichment tools, vendors, and data sources including Clay, Dun &amp; Bradstreet, ZoomInfo, and emerging providers</li>
</ul>
<ul>
<li>Build and maintain a unified enrichment master,a reliable source of truth for company and person data including parent-child account hierarchies, firmographics, technographics, and contact intelligence across domestic and international markets</li>
</ul>
<ul>
<li>Design and implement waterfall enrichment workflows that orchestrate multiple data providers to maximize coverage, accuracy, and cost efficiency while minimizing redundancy</li>
</ul>
<ul>
<li>Architect enrichment data models within Salesforce, making strategic decisions about how enrichment data is stored, related, and surfaced (e.g., custom objects vs. direct field integration, parent account structures, enrichment audit trails)</li>
</ul>
<ul>
<li>Hands-on data manipulation and transformation,write queries, build data pipelines, and work directly with data warehouses (e.g., Snowflake, BigQuery) to clean, transform, match, and deduplicate enrichment data at scale</li>
</ul>
<ul>
<li>Lead international enrichment strategy, addressing the unique challenges of enriching company and contact data across global markets with varying data availability, provider coverage, and regulatory requirements</li>
</ul>
<ul>
<li>Partner with Data Science and Data Engineering to define enrichment schemas, resolve entity matching challenges, and build scalable infrastructure that supports both real-time and batch enrichment processes</li>
</ul>
<ul>
<li>Collaborate with Sales, Marketing, and Revenue Operations to understand GTM data needs, translate business requirements into enrichment solutions, and ensure enrichment outputs directly support pipeline generation, territory planning, lead routing, and account scoring</li>
</ul>
<ul>
<li>Define and track enrichment KPIs including match rates, data completeness, freshness, accuracy, and downstream GTM impact,using metrics to continuously improve the enrichment ecosystem</li>
</ul>
<ul>
<li>Evaluate and onboard new enrichment vendors and data sources, conducting proof-of-concept testing and negotiating contracts in partnership with procurement</li>
</ul>
<ul>
<li>Explore and implement AI-powered enrichment capabilities, including prompt-based enrichment using LLMs to supplement traditional data providers for emerging companies, startups, and hard-to-enrich segments</li>
</ul>
<p>You may be a good fit if you have:</p>
<ul>
<li>10+ years of experience in data enrichment, data operations, or revenue/marketing operations with hands-on ownership of enrichment tools and strategy in a B2B SaaS or enterprise technology environment</li>
</ul>
<ul>
<li>Deep expertise with enrichment platforms such as Clay, Dun &amp; Bradstreet (D-U-N-S, Data Blocks, hierarchies), ZoomInfo, Clearbit, People Data Labs, or comparable providers, including experience building waterfall enrichment workflows and enrichment masters</li>
</ul>
<ul>
<li>Strong Salesforce experience (required),including data modeling for enrichment (custom objects, account hierarchies, parent-child relationships), integration architecture, and understanding of how enrichment data flows through the CRM to support GTM processes</li>
</ul>
<ul>
<li>Hands-on technical skills for data manipulation including SQL proficiency, experience with data warehouses (Snowflake, BigQuery, or similar), and comfort working with ETL/reverse ETL pipelines, APIs, and data transformation tools</li>
</ul>
<ul>
<li>Strong product ownership mindset with experience managing roadmaps, backlogs, and stakeholder priorities,able to translate business needs into technical requirements and drive execution across cross-functional teams</li>
</ul>
<ul>
<li>Dual data + RevOps mindset,equally comfortable working with Data Science and Data Engineering on infrastructure and schema design as you are partnering with Sales and GTM teams on pipeline and territory optimization</li>
</ul>
<ul>
<li>Excellent communication skills to bridge technical and business audiences, lead stakeholder discovery sessions, and present enrichment strategy and impact to leadership</li>
</ul>
<p>Strong candidates may have:</p>
<ul>
<li>Experience building or leveraging AI-powered enrichment prompts (e.g., using LLMs to research and enrich company data, identify signals, or fill gaps where traditional providers lack coverage)</li>
</ul>
<ul>
<li>Familiarity with data quality and MDM (Master Data Management) frameworks and tools</li>
</ul>
<ul>
<li>Experience with routing and scoring tools such as LeanData, and marketing automation platforms</li>
</ul>
<ul>
<li>Background in startup signal detection,identifying high-potential early-stage companies through funding, hiring, technographic, and intent signals</li>
</ul>
<p>The annual compensation range for this role is listed below.</p>
<p>For sales roles, the range provided is the role’s On Target Earnings (&quot;OTE&quot;) range, meaning that the range includes both the sales commissions/sales bonuses target and annual base salary for the role.</p>
<p>Annual Salary: $190,000-$270,000 USD</p>
<p>Logistics</p>
<p>Minimum education: Bachelor’s degree or an equivalent combination of education, training, and/or experience</p>
<p>Required field of study: A field relevant to the role as demonstrated through coursework, training, or professional experience</p>
<p>Minimum years of experience: Years of experience required will correlate with the internal job level requirements for the position</p>
<p>Location-based hybrid policy: Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</p>
<p>Visa sponsorship: We do sponsor visas! However, we aren’t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</p>
<p>We encourage you to apply even if you do not believe you meet every single qualification. Not all strong candidates will meet every single qualification as listed. Research shows that people who identify as being from underrepresented groups are more prone to experiencing imposter syndrome and doubting the strength of their candidacy, so we urge you not to exclude yourself prematurely and to submit an application if you’re interested in this work. We think AI systems like the one</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$190,000-$270,000 USD</Salaryrange>
      <Skills>data enrichment, data operations, revenue/marketing operations, enrichment tools, enrichment strategy, salesforce, sql, data warehouses, etl/reverse etl pipelines, apis, data transformation tools, product ownership, roadmaps, backlogs, stakeholder priorities, technical requirements, cross-functional teams, data science, data engineering, infrastructure, schema design, pipeline and territory optimization, communication skills, technical and business audiences, stakeholder discovery sessions, present enrichment strategy and impact to leadership, ai-powered enrichment, llms, prompt-based enrichment, emerging companies, startups, hard-to-enrich segments, data quality, mdm frameworks, routing and scoring tools, marketing automation platforms, startup signal detection</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.co.png</Employerlogo>
      <Employerdescription>Anthropic is a technology company that aims to create reliable, interpretable, and steerable AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.co/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5127289008</Applyto>
      <Location>San Francisco, CA | New York City, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6d4e84e5-9fa</externalid>
      <Title>Pipeline Engineer (3D Data)</Title>
      <Description><![CDATA[<p>We&#39;re looking for a 3D Data Pipeline Engineer to design, build, and operate the core systems that enable high-quality 3D data processing, synthetic data generation, and rendering across our products.</p>
<p>This is a hands-on role for someone who is passionate about large-scale 3D data, system performance, and delivering reliable data pipelines to power our product features.</p>
<p>You&#39;ll work closely with product engineers, 3D artists, and research scientists to design efficient, robust, and scalable data pipeline capabilities,while keeping data integrity and performance high in a fast-moving startup environment.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Design, build, and operate automated pipelines for 3D data ingestion, cleaning, processing, validation, and delivery that sit on the critical path for model training.</li>
<li>Own foundational capabilities for synthetic data generation, including developing tools, workflows, and quality metrics to produce high-fidelity training data at scale.</li>
<li>Develop and optimize high-performance rendering systems and services for real-time visualization and asset generation.</li>
<li>Architect and operate distributed data systems for handling massive volumes of 3D models, textures, and associated metadata, ensuring data consistency and robust failure recovery.</li>
<li>Own data quality and production readiness end-to-end: defining data schemas, implementing quality checks, capacity planning, observability, and continuous improvement for the 3D pipeline.</li>
<li>Improve developer and researcher velocity by building shared abstractions, tooling, and guardrails that reduce the operational and cognitive load of working with 3D assets.</li>
<li>Collaborate with cross-functional teams to integrate the 3D data pipeline with other core product platforms and services.</li>
<li>Set technical direction, mentor engineers, and raise the data engineering bar across the product org with a focus on 3D data.</li>
</ul>
<p><strong>Key Qualifications:</strong></p>
<ul>
<li>6+ years of experience building and operating large-scale data pipelines, especially with a focus on 3D, graphics, or simulation data, with deep experience designing scalable, distributed services in production.</li>
<li>Strong programming skills in Python and/or C++, and a solid foundation in data engineering principles and distributed systems architecture.</li>
<li>Hands-on experience with 3D data processing libraries, game engines (e.g., Unity, Unreal), or rendering APIs (e.g., OpenGL, Vulkan).</li>
<li>Experience with cloud-based data storage and processing solutions (e.g., Kubernetes, distributed file systems, data warehouses).</li>
<li>Experience working in fast-moving or startup environments, ideally having led systems or products from early design through production and growth.</li>
<li>A high bar for ownership and execution: you’re comfortable with ambiguity, take responsibility for outcomes, and drive work forward without waiting for perfect clarity.</li>
<li>A product-first mindset: you care about data quality, pipeline reliability, and performance as core product features, not afterthoughts.</li>
<li>Enjoy collaborating with a small, high-ownership team and raising the quality bar through code, data design, and example.</li>
</ul>
<p><strong>Who You Are:</strong></p>
<ul>
<li>Fearless Innovator: We need people who thrive on challenges and aren&#39;t afraid to tackle the impossible.</li>
<li>Resilient Builder: Impacting Large World Models isn&#39;t a sprint; it&#39;s a marathon with hurdles. We&#39;re looking for builders who can weather the storms of groundbreaking research and come out stronger.</li>
<li>Mission-Driven Mindset: Everything we do is in service of creating the best spatially intelligent AI systems, and using them to empower people.</li>
<li>Collaborative Spirit: We&#39;re building something bigger than any one person. We need team players who can harness the power of collective intelligence.</li>
</ul>
<p>We&#39;re hiring the brightest minds from around the globe to bring diverse perspectives to our cutting-edge work. If you&#39;re ready to work on technology that will reshape how machines perceive and interact with the world, World Labs is your launchpad.</p>
<p>Join us, and let&#39;s make history together.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$250k-$350k base salary (good-faith estimate for San Francisco Bay Area upon hire; actual offer based on experience, skills, and qualifications)</Salaryrange>
      <Skills>Python, C++, 3D data processing, Game engines (e.g., Unity, Unreal), Rendering APIs (e.g., OpenGL, Vulkan), Cloud-based data storage and processing solutions (e.g., Kubernetes, distributed file systems, data warehouses)</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>World Labs</Employername>
      <Employerlogo>https://logos.yubhub.co/worldlabs.ai.png</Employerlogo>
      <Employerdescription>World Labs builds foundational world models that can perceive, generate, reason, and interact with the 3D world.</Employerdescription>
      <Employerwebsite>https://www.worldlabs.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/worldlabs/jobs/4110240009</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>ceba9e5b-250</externalid>
      <Title>Senior Backend Engineer, Product and Infra</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Senior Backend Engineer to build the systems and services that power our product experience. You&#39;ll own the backend infrastructure that makes our content discoverable, our features responsive, and our platform reliable at scale.</p>
<p>Your work will directly shape what users experience: designing APIs that serve rich content, building services that handle real-time interactions, implementing content-matching systems for rights and safety, and ensuring our platform performs under load. You&#39;ll architect systems that are fast, correct, and maintainable.</p>
<p>You&#39;ll collaborate closely with Product, ML Research, and Mobile/Web teams to ship features that matter. We use Python, Go, BigQuery, Pub/Sub, and a microservices architecture,but we care more about good judgment than specific tool experience.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design and maintain application-level data models that organize rich content into canonical structures optimized for product features, search, and retrieval.</li>
<li>Build high-reliability ETLs and streaming pipelines to process usage events, analytics data, behavioral signals, and application logs.</li>
<li>Develop data services that expose unified content to the application, such as metadata access APIs, indexing workflows, and retrieval-ready representations.</li>
<li>Implement and refine fingerprinting pipelines used for deduplication, rights attribution, safety checks, and provenance validation.</li>
<li>Own data consistency between ingestion systems, application surfaces, metadata storage, and downstream reporting environments.</li>
<li>Define and track key operational metrics, including latency, completeness, accuracy, and event health.</li>
<li>Collaborate with Product teams to ensure content structures and APIs support evolving features and high-quality user experiences.</li>
<li>Partner with Analytics and Research teams to deliver clean usage datasets for experimentation, model evaluation, reporting, and internal insights.</li>
<li>Operate large analytical workloads in BigQuery and build reusable Dataflow/Beam components for structured processing.</li>
<li>Improve reliability and scale by designing robust schema evolution strategies, idempotent pipelines, and well-instrumented operational flows.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Experience building production backend services and APIs at scale</li>
<li>Experience building ETL/ELT pipelines, event processing systems, and structured data models for applications or analytics</li>
<li>Strong background in data modeling, metadata systems, indexing, or building canonical representations for heterogeneous content</li>
<li>Proficiency in Python, Go, SQL, and scalable data-processing frameworks (Dataflow/Beam, Spark, or similar)</li>
<li>Familiarity with BigQuery or other analytical data warehouses and strong comfort optimizing large queries and schemas</li>
<li>Experience with event-driven architectures, Pub/Sub, or Kafka-like systems</li>
<li>Strong understanding of data quality, schema evolution, lineage, and operational reliability</li>
<li>Ability to design pipelines that balance cost, latency, correctness, and scale</li>
<li>Clear communication skills and an ability to collaborate closely with Product, Research, and Analytics stakeholders</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Experience building application-facing APIs or microservices that expose structured content</li>
<li>Background in information retrieval, indexing systems, or search infrastructure</li>
<li>Experience with fingerprinting, perceptual hashing, audio similarity metrics, or content-matching algorithms</li>
<li>Familiarity with ML workflows and how downstream analytics and usage data feed back into research pipelines</li>
<li>Understanding of batch + streaming architectures and how to blend them effectively</li>
<li>Experience with Go, Next.js, or React Native for occasional full-stack contributions</li>
</ul>
<p><strong>Why Join Us</strong></p>
<p>You will design the core data services and pipelines that power our product experience, analytics, and business operations. You’ll work on high-impact data challenges involving real-time signals, large-scale metadata systems, and cross-platform consistency. You’ll join a small, fast-moving team where you’ll shape the structure, reliability, and intelligence of our downstream data ecosystem.</p>
<p><strong>Benefits</strong></p>
<ul>
<li>Highly competitive salary and equity</li>
<li>Quarterly productivity budget</li>
<li>Flexible time off</li>
<li>Fantastic office location in Manhattan</li>
<li>Productivity package, including ChatGPT Plus, Claude Code, and Copilot</li>
<li>Top-notch private health, dental, and vision insurance for you and your dependents</li>
<li>401(k) plan options with employer matching</li>
<li>Concierge medical/primary care through One Medical and Rightway</li>
<li>Mental health support from Spring Health</li>
<li>Personalized life insurance, travel assistance, and many other perks</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,000 - $220,000</Salaryrange>
      <Skills>Python, Go, BigQuery, Pub/Sub, Data modeling, Metadata systems, Indexing, Canonical representations, ETL/ELT pipelines, Event processing systems, Structured data models, Scalable data-processing frameworks, Analytical data warehouses, Event-driven architectures, Kafka-like systems, Data quality, Schema evolution, Lineage, Operational reliability, Application-facing APIs, Microservices, Information retrieval, Indexing systems, Search infrastructure, Fingerprinting, Perceptual hashing, Audio similarity metrics, Content-matching algorithms, ML workflows, Batch + streaming architectures</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Udio</Employername>
      <Employerlogo>https://logos.yubhub.co/udio.com.png</Employerlogo>
      <Employerdescription>Udio is a technology company that powers product experiences.</Employerdescription>
      <Employerwebsite>https://www.udio.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/udio/jobs/4987729008</Applyto>
      <Location>New York</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>78a9b8f2-81c</externalid>
      <Title>Senior Software Engineer - Data Infrastructure</Title>
      <Description><![CDATA[<p>We believe that the way people interact with their finances will drastically improve in the next few years. We&#39;re dedicated to empowering this transformation by building the tools and experiences that thousands of developers use to create their own products.</p>
<p>Plaid powers the tools millions of people rely on to live a healthier financial life. We work with thousands of companies like Venmo, SoFi, several of the Fortune 500, and many of the largest banks to make it easy for people to connect their financial accounts to the apps and services they want to use.</p>
<p>Making data driven decisions is key to Plaid&#39;s culture. To support that, we need to scale our data systems while maintaining correct and complete data. We provide tooling and guidance to teams across engineering, product, and business and help them explore our data quickly and safely to get the data insights they need, which ultimately helps Plaid serve our customers more effectively.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Contribute towards the long-term technical roadmap for data-driven and machine learning iteration at Plaid</li>
<li>Leading key data infrastructure projects such as improving ML development golden paths, implementing offline streaming solutions for data freshness, building net new ETL pipeline infrastructure, and evolving data warehouse or data lakehouse capabilities.</li>
<li>Working with stakeholders in other teams and functions to define technical roadmaps for key backend systems and abstractions across Plaid.</li>
<li>Debugging, troubleshooting, and reducing operational burden for our Data Platform.</li>
<li>Growing the team via mentorship and leadership, reviewing technical documents and code changes.</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>5+ years of software engineering experience</li>
<li>Extensive hands-on software engineering experience, with a strong track record of delivering successful projects within the Data Infrastructure or Platform domain at similar or larger companies.</li>
<li>Deep understanding of one of: ML Infrastructure systems, including Feature Stores, Training Infrastructure, Serving Infrastructure, and Model Monitoring OR Data Infrastructure systems, including Data Warehouses, Data Lakehouses, Apache Spark, Streaming Infrastructure, Workflow Orchestration.</li>
<li>Strong cross-functional collaboration, communication, and project management skills, with proven ability to coordinate effectively.</li>
<li>Proficiency in coding, testing, and system design, ensuring reliable and scalable solutions.</li>
<li>Demonstrated leadership abilities, including experience mentoring and guiding junior engineers.</li>
</ul>
<p><strong>Additional Information</strong></p>
<p>Our mission at Plaid is to unlock financial freedom for everyone. To support that mission, we seek to build a diverse team of driven individuals who care deeply about making the financial ecosystem more equitable.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$190,800-$286,800 per year</Salaryrange>
      <Skills>ML Infrastructure systems, Data Infrastructure systems, Apache Spark, Streaming Infrastructure, Workflow Orchestration, Feature Stores, Training Infrastructure, Serving Infrastructure, Model Monitoring, Data Warehouses, Data Lakehouses</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Plaid</Employername>
      <Employerlogo>https://logos.yubhub.co/plaid.com.png</Employerlogo>
      <Employerdescription>Plaid builds tools and experiences that thousands of developers use to create their own products, connecting financial accounts to apps and services.</Employerdescription>
      <Employerwebsite>https://plaid.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/plaid/05b0ae3f-ec60-48d6-ae27-1bd89d928c47</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>7b750523-8ff</externalid>
      <Title>Staff Software Engineer, Data Engineering</Title>
      <Description><![CDATA[<p>We are seeking a Staff Software Engineer to lead the technical strategy and implementation of our enterprise data architecture, governance foundations, and analytics enablement tooling.</p>
<p>In this role, you will be the primary engineering counterpart to the Senior Product Manager for Data Enablement &amp; Governance, jointly shaping the roadmap for enterprise analytics, shared definitions, and the tools that help Omada answer questions faster and more reliably.</p>
<p>You will design and evolve core data products, define patterns and standards used across the company, and drive the technical execution of initiatives that ensure our metrics, reports, and data products are scalable, governed, and trustworthy.</p>
<p>This is a high-impact, cross-functional Staff role working across Data Engineering, Data Science, Analytics, Product, IT, and business leaders.</p>
<p><strong>Key Responsibilities:</strong></p>
<p><strong>Enterprise Data Architecture</strong></p>
<ul>
<li>Own the vision and technical roadmap for Omada&#39;s enterprise data architecture, spanning ingestion, storage, modeling, and serving layers for analytics and applied statistics use cases.</li>
<li>Design, implement, and evolve scalable, secure, and cost-efficient data solutions (datalakes, warehouses, marts, semantic layers) that support governed, cross-functional analytics and self-service.</li>
<li>Define and socialize architectural patterns, data contracts, and integration standards used by data and product teams across the organization.</li>
<li>Anticipate future needs (e.g., new product lines, new modalities, AI/ML workloads) and drive proactive architectural changes rather than reacting to incidents or point-in-time requests.</li>
</ul>
<p><strong>Data Modeling, Quality, and Governance Foundations</strong></p>
<ul>
<li>Lead the design of logical and physical data models to support enterprise metrics, dashboards, and ad hoc analytics, with a focus on reusability and clear ownership.</li>
<li>Implement robust data quality, validation, and monitoring frameworks that underpin trusted “single source of truth” definitions for core concepts (e.g., active member, MAU, GLP-1 member).</li>
<li>Partner with the Senior Product Manager, Data Enablement &amp; Governance to translate governance decisions (definitions, ownership, change-management processes) into concrete technical implementations in the data platform.</li>
<li>Set standards and review mechanisms to ensure new pipelines, marts, and reports align with enterprise definitions and governance policies.</li>
<li>Continuously improve performance, scalability, and cost-efficiency of data workflows and storage; lead deep dives and remediation for complex production issues.</li>
</ul>
<p><strong>Enterprise Data Products Lifecycle</strong></p>
<ul>
<li>In close partnership with the Senior PM, define and deliver core, reusable data products (e.g., engagement, clinical, financial, client, care delivery datasets) that power dashboards, reporting, and self-service analytics.</li>
<li>Co-Architect and implement technical foundations for AI-assisted analytics tools, governed semantic layers, and reporting applications that make analysts and business users more efficient.</li>
<li>Partner with Product and Engineering teams owning tools like Amplitude, Tableau, and internal reporting tools to ensure consistent instrumentation, mapping to enterprise definitions, and scalable access patterns.</li>
<li>Translate business and product requirements into resilient schemas, data services, and interfaces that are usable, maintainable, and auditable.</li>
<li>Ensure production data delivery meets defined SLAs and supports downstream BI, reporting apps, and applied statistics workloads.</li>
<li>Play a key role in cross-functional forums (e.g., Data Governance Committee, analytics communities) as the technical voice for feasibility, risk, and long-term platform health.</li>
</ul>
<p><strong>Technical Leadership, Mentorship, and Culture</strong></p>
<ul>
<li>Lead large, multi-team technical initiatives,from design to implementation and rollout,setting a high bar for design docs, reviews, and execution quality.</li>
<li>Mentor senior and mid-level engineers, elevating the team’s skills in data modeling, pipeline design, governance, and platform thinking.</li>
<li>Help shape playbooks for how product squads and spokes engage with central data teams on new metrics, data products, and applied stats projects.</li>
<li>Partner closely with Analytics, Data Science, Product, and business leaders to ensure data architecture and governance decisions are aligned with company OKRs and measurable business value.</li>
<li>Proactively identify complexity, duplication, and fragility in existing systems; drive simplification and standardization with sustainable solutions.</li>
<li>Model Omada’s values in day-to-day work, fostering a culture of trust, context-seeking, bold thinking, and high-impact delivery.</li>
</ul>
<p><strong>About You:</strong></p>
<ul>
<li>8+ years of experience building, maintaining, and orchestrating scalable data platforms and high-quality production pipelines, including significant experience in analytics or warehousing environments.</li>
<li>Demonstrated Staff-level impact: leading cross-team technical initiatives, making architectural decisions that shaped a multi-year roadmap, and influencing stakeholders beyond your immediate team.</li>
<li>Deep experience with cloud data ecosystems (e.g., AWS) and modern data warehouses (e.g., Redshift, Snowflake, BigQuery), including MPP query optimization.</li>
<li>Strong background in data modeling for OLTP and OLAP, and designing reusable data products for BI, reporting, and advanced analytics.</li>
<li>Hands-on experience implementing data quality, observability, and governance frameworks, ideally in a regulated or PHI/PII-sensitive environment.</li>
<li>Experience partnering with Product Management and Analytics to define and deliver platform capabilities, not just point solutions.</li>
</ul>
<p><strong>Technical Skills:</strong></p>
<ul>
<li>Strong proficiency in SQL (analytical and performance-tuned) and experience with relational and MPP databases.</li>
<li>Proficiency in at least one modern programming language used in data engineering (e.g., Python, Java, Scala) and comfort applying software engineering best practices (testing, CI/CD, code review).</li>
<li>Experience with workflow orchestration and data integration tools (e.g., Airflow) and event-driven or streaming patterns where appropriate.</li>
<li>Familiarity with BI and analytics tools (e.g., Tableau, Amplitude, or similar) and how they integrate with governed data layers.</li>
<li>Experience with data governance concepts (ownership, lineage, definitions, access controls) and their technical implementation in a modern data stack.</li>
<li>Familiarity with AI tools for development.</li>
</ul>
<p><strong>Communication &amp; Working Style:</strong></p>
<ul>
<li>Excellent communication and collaboration skills, with the ability to convey complex technical concepts to non-technical stakeholders.</li>
<li>Highly self-directed and comfortable operating in ambiguous, cross-functional problem spaces, creating clarity and direction where none exists.</li>
<li>Strong sense of ownership and bias for impact; you care about outcomes for members, customers, and internal users, not just elegant systems.</li>
</ul>
<p><strong>Benefits:</strong></p>
<ul>
<li>Competitive salary with generous annual cash bonus</li>
<li>Equity grants</li>
<li>Remote first work from home culture</li>
<li>Flexible Time Off to help you recharge</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Cloud data ecosystems, Modern data warehouses, MPP query optimization, Data modeling, Data quality, Data governance, Workflow orchestration, Data integration, Event-driven or streaming patterns, BI and analytics tools, AI tools for development</Skills>
      <Category>Engineering</Category>
      <Industry>Healthcare</Industry>
      <Employername>Omada Health</Employername>
      <Employerlogo>https://logos.yubhub.co/omadahealth.com.png</Employerlogo>
      <Employerdescription>Omada Health is a healthcare technology company that provides digital therapeutics for chronic disease management.</Employerdescription>
      <Employerwebsite>https://www.omadahealth.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/omadahealth/jobs/7753330</Applyto>
      <Location>Remote, USA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>5ec63ea6-5a3</externalid>
      <Title>Data Engineer</Title>
      <Description><![CDATA[<p>At Neighbor, we&#39;re building the largest hyperlocal marketplace the world has ever seen. As a Data Engineer, you will be the core engineering resource responsible for building, scaling, and optimizing the data infrastructure that transforms raw events into high-fidelity, actionable intelligence.</p>
<p>This engineering resource will be the cornerstone of our data infrastructure, responsible for extraction, transform, and load of the data that powers our nation-wide, best-in-class marketplace. By implementing software engineering best practices and scalable solutions, this role is critical in empowering the CEO, executive team, managers, and individual contributors with the robust and trustworthy intelligence needed to scale and innovate across our marketplace.</p>
<p><strong>Primary Responsibilities</strong></p>
<ul>
<li>Design, implement, and maintain scalable data transformation layers and code-first orchestration frameworks to ensure the delivery of high-fidelity, reusable data models</li>
<li>Design and build robust pipelines to ingest data from diverse sources (APIs, logs, relational DBs)</li>
<li>Ensure the reliable and timely execution of all critical data pipelines (ETLs/ELTs) to maintain data integrity and freshness</li>
<li>Standardize analytics workflows by integrating software engineering best practices, including version control, CI/CD pipelines, and automated data validation protocols</li>
<li>Develop and refine a robust semantic layer to facilitate self-service analytics, enabling stakeholders to derive insights without exposure to underlying architectural complexities</li>
<li>Monitor and optimize cloud compute utilization and data model performance to ensure high availability and low-latency reporting during periods of rapid data scaling</li>
<li>Serve as a strategic technical partner to leadership across Product, Engineering, Marketing, and Finance to align data infrastructure with organizational objectives</li>
<li>Become a subject matter expert on the product ecosystem, user behavior, and marketing life cycles to better translate raw data into business value</li>
<li>Serve as a versatile technical resource capable of stepping into the Data Analyst capacity when necessary,performing deep-dive quantitative analysis and building sophisticated visualizations to support executive decision-making</li>
<li>Mentor the data analytics team on advanced technical methodologies to foster a culture of engineering excellence and data autonomy</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>3+ years of experience in data engineering or analytics engineering</li>
<li>Bachelor&#39;s degree in quantitative and/or technical fields (Math, Physics, Statistics, Economics, Computer Science, Engineering, etc.) OR 5+ years work experience as a Data Engineer</li>
<li>Expert-level mastery of SQL, with the ability to write, tune, and optimize complex queries for high-volume environments</li>
<li>Strong command of at least one major programming language used for data processing</li>
<li>Hands-on experience designing and maintaining data lakes or cloud-based data warehouses</li>
<li>Deep understanding of data integration patterns, including data ingestion, transformation, and automated cleansing (ETL/ELT)</li>
<li>Experience applying scientific, mathematical, or statistical techniques to analyze data and build predictive models</li>
<li>Advanced ability to translate complex datasets into actionable narratives using modern business intelligence and reporting tools</li>
<li>A proven track record of using quantitative analysis to solve ambiguous problems and drive strategic decision-making in a fast-paced environment</li>
<li>Exceptional ability to collaborate with non-technical stakeholders, translating business requirements into technical specs and vice versa</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Generous Stock options</li>
<li>Medical, dental, and vision insurance</li>
<li>Generous PTO</li>
<li>11 paid company holidays</li>
<li>Hybrid work model - WFH every Monday</li>
<li>401(k) plan</li>
<li>Infant care leave</li>
<li>On-site gym/showers open 24/7</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Programming languages, Data lakes, Cloud-based data warehouses, Data integration patterns, Scientific, mathematical, or statistical techniques, Business intelligence and reporting tools</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Neighbor</Employername>
      <Employerlogo>https://logos.yubhub.co/neighbor.com.png</Employerlogo>
      <Employerdescription>Neighbor is a marketplace for self storage and parking, operating across almost every U.S. city.</Employerdescription>
      <Employerwebsite>https://neighbor.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/neighbor/da1304b7-89ad-4ac0-99e8-9c0cf8284f1c</Applyto>
      <Location>U.S.</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>2854e5c8-3c7</externalid>
      <Title>Solution Operations</Title>
      <Description><![CDATA[<p>About Mistral AI</p>
<p>At Mistral AI, we believe in the power of AI to simplify tasks, save time, and enhance learning and creativity. Our technology is designed to integrate seamlessly into daily working life.</p>
<p>Role Summary</p>
<p>As a Solution Operations, you will serve as a strategic business partner to our Solution team. This team, composed of AI Deployment Strategists, Infrastructure Solution Architects, and Applied AI Engineers, designs, deploys, and optimizes AI solutions that directly solve our enterprise customers&#39; most complex challenges.</p>
<p>Responsibilities</p>
<p>Worldwide Strategic Staffing &amp; Capacity Planning - across all geographies</p>
<ul>
<li><p>Develop and execute a forward-looking staffing strategy aligned with business forecasts, including resource allocation, staffing ratios across regions, and technical deployment metrics.</p>
</li>
<li><p>Build and maintain a prioritization framework for recruitment (role type, geography) to ensure the Solutions team is staffed for high-impact customer engagements.</p>
</li>
<li><p>Accelerate time-to-value by minimizing time-to-staff through efficient matching systems between skills and customer requirements.</p>
</li>
</ul>
<p>Data-Driven Staffing Optimization</p>
<ul>
<li><p>Identify and operationalize key metrics to measure staffing efficiency, team utilization, and impact on customer success and revenue.</p>
</li>
<li><p>Create and maintain reporting mechanisms to track staffing KPIs, including time-to-staff, skill gaps, and deployment success rates.</p>
</li>
<li><p>Synthesize and implement actionable recommendations to improve staffing processes and cross-functional alignment.</p>
</li>
</ul>
<p>Scalable Staffing Systems &amp; Automation</p>
<ul>
<li><p>Design and implement scalable processes, automations, and tools to streamline talent deployment and reduce operational friction.</p>
</li>
<li><p>Optimize the Mistraler lifecycle (onboarding to project allocation) by ensuring seamless transitions and maximizing team productivity.</p>
</li>
<li><p>Identify and eliminate bottlenecks in staffing workflows, leveraging automation to enhance agility and responsiveness to customer needs.</p>
</li>
</ul>
<p>Cross-Functional Collaboration</p>
<ul>
<li><p>Partner with Sales, Product, Revenue Operations, Talent Acquisition and HR to align staffing capabilities with customer demands and business priorities.</p>
</li>
<li><p>Support the development of the Solution team offerings and technical engagement models that maximize customer success.</p>
</li>
<li><p>Collaborate with Product teams to ensure Solutions team feedback is integrated into the product roadmap.</p>
</li>
</ul>
<p>Prepare materials for executive reviews, highlighting staffing successes, challenges, and strategic recommendations.</p>
<p>About you</p>
<ul>
<li><p>3+ years of experience in strategic operations, staffing, chief of staff or resource management within technical sales, solutions engineering, or professional services environments.</p>
</li>
<li><p>Proven track record in optimizing processes, impacting business KPIs, and aligning resources with business priorities.</p>
</li>
<li><p>Experience in staffing, resource management, or talent deployment is a strong plus.</p>
</li>
<li><p>Strong analytical skills, with experience in Salesforce, data warehouses, and BI tools (e.g., Looker, Hex, BigQuery).</p>
</li>
<li><p>Exceptional negotiation and diplomacy skills: ability to navigate complex stakeholder dynamics and align competing priorities.</p>
</li>
<li><p>Hands-on, execution-focused mindset with a bias for action in ambiguous, fast-changing environments.</p>
</li>
<li><p>Technical acumen: understanding of AI implementation, use cases, and scaling challenges is a strong plus.</p>
</li>
<li><p>Bachelor’s degree required; MBA or advanced degree preferred.</p>
</li>
<li><p>Fluency in English (native level); additional European languages are a plus.</p>
</li>
</ul>
<p>Benefits</p>
<p>France</p>
<ul>
<li><p>Competitive cash salary and equity</p>
</li>
<li><p>Daily lunch vouchers: Swile meal vouchers with 10,83€ per worked day, incl 60% offered by company</p>
</li>
<li><p>Sport: Enjoy discounted access to gyms and fitness studios through our Wellpass partnership</p>
</li>
<li><p>Transportation: Monthly contribution to a mobility pass via Betterway</p>
</li>
<li><p>Health: Full health insurance for you and your family</p>
</li>
<li><p>Parental: Generous parental leave policy</p>
</li>
<li><p>Visa sponsorship</p>
</li>
<li><p>Coaching: we offer BetterUp coaching on a voluntary basis</p>
</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>strategic operations, staffing, resource management, talent deployment, salesforce, data warehouses, BI tools, Looker, Hex, BigQuery</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mistral AI</Employername>
      <Employerlogo>https://logos.yubhub.co/mistral.ai.png</Employerlogo>
      <Employerdescription>Mistral AI is an AI technology company that provides high-performance, optimized, open-source and cutting-edge models, products and solutions for enterprise needs.</Employerdescription>
      <Employerwebsite>https://mistral.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/mistral/c4f5669e-305b-4e9f-9ae8-10fd50682273</Applyto>
      <Location>Paris</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>87c01508-dd0</externalid>
      <Title>Data Analyst</Title>
      <Description><![CDATA[<p>We are looking for a Data Analyst to join our Marketing Science team. As a Data Analyst, you will be the analytical lead across growth marketing, registration, and website analytics, setting the measurement framework, driving analysis independently, and translating results into decisions stakeholders can act on.</p>
<p>Your day-to-day will involve owning analytical coverage across growth marketing, registration, and website, setting measurement frameworks, building scalable reporting, and driving the analytical roadmap for your domain. You will lead analysis on complex, ambiguous questions: incrementality testing, LTV and payback modeling, attribution deep dives, funnel decomposition, and cohort analysis.</p>
<p>You will build and own dbt models that encode business logic for your domain in a scalable, reusable way to power dashboards, reporting, and stakeholder self-serve. You will build and maintain Tableau dashboards that give marketing and leadership a clear, reliable view of performance across growth, registration, and web.</p>
<p>You will partner directly with marketing stakeholders to scope analytical problems, pressure-test interpretations, and make sure findings land as decisions, not just slide decks. You will use Claude fluently as a daily productivity layer, accelerating SQL, generating documentation, QA-ing work, and building faster analytical workflows.</p>
<p>You will mentor and raise the floor for junior analysts on the team. You will review their work, help them think through ambiguous problems, and set the standard for analytical quality. You will identify gaps before they become problems, from data quality and metric definitions to reporting coverage and stakeholder understanding. You will own the fix, not just the flag.</p>
<p>We are looking for someone with 4–8 years of experience in a data or analytics role, with hands-on experience in at least one marketing channel: paid search (preferred), paid social, TV/CTV, affiliates, direct mail, or similar. You should understand how that channel&#39;s data works, how it&#39;s structured, what the key metrics are, and how performance is measured and reported. Multi-channel experience is a strong plus.</p>
<p>You should have strong, proven SQL ability. You should write clean, efficient queries across complex schemas and build analytics logic others can maintain. Dbt experience, including owning models in production, is a strong plus. You should be proficient with Claude or similar AI tools for analytical work. You should use them to move faster, not as a crutch. You should know how to validate outputs rather than ship the first answer.</p>
<p>You should write readable, well-structured Python for analytical work: data manipulation, automation, or scripting. Your code should be something a teammate can pick up and maintain. You should understand how attribution models work, what makes a good experiment, and how to think about LTV, CAC, and payback period without needing a primer.</p>
<p>You should be proactive and not wait to be asked. You should spot the gap, scope the work, and drive it to a conclusion. Your stakeholders shouldn&#39;t have to manage your backlog. You should be able to take a fuzzy business question and turn it into a clear analytical plan. You should ask the right clarifying questions and bring structure without needing the problem handed to you pre-scoped.</p>
<p>You should communicate findings in a way that drives decisions. You should know when a chart is enough and when you need a narrative. You&#39;ve presented to senior stakeholders and know how to calibrate.</p>
<p>Preferred experience includes breadth across multiple marketing channels: paid search, paid social, TV/CTV, affiliates, or others. Hands-on experience with experiment design and incrementality testing is also preferred. Experience with Snowflake or a similar cloud data warehouse is a plus. Experience mentoring or reviewing the work of junior analysts is also a plus.</p>
<p>Work perks at Greenlight include medical, dental, vision, and HSA match, paid life insurance, AD&amp;D, and disability benefits, traditional 401k with company match, unlimited PTO, paid company holidays and pop-up bonus holidays, professional development stipends, mental health resources, 1:1 financial planners, fertility healthcare, 100% paid parental and caregiving leave, plus cleaning service and meals during your leave, flexible WFH, both remote and in-office opportunities, fully stocked kitchen, catered lunches, and occasional in-office happy hours, employee resource groups.</p>
<p>Our stance on salaries is that Greenlight provides a competitive compensation package with a market-based approach to pay and will vary depending on your location, experience, and skill set. The total compensation package for this position will also include a discretionary performance bonus, equity rewards, medical benefits, 401K match, and more. Greenlight conducts continuous compensation evaluations across departments and geographies to ensure we are keeping our pay current and competitive.</p>
<p>The estimated base pay range for this position in NY, CA, WA is $125,000-145,000. The estimated base pay range for this position in CO is $125,000-135,000.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$125,000-145,000 (NY, CA, WA); $125,000-135,000 (CO)</Salaryrange>
      <Skills>SQL, dbt, Tableau, Claude, Python, data manipulation, automation, scripting, attribution models, experiment design, incrementality testing, paid search, paid social, TV/CTV, affiliates, direct mail, Snowflake, cloud data warehouse, mentoring, reviewing junior analyst work</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Greenlight</Employername>
      <Employerlogo>https://logos.yubhub.co/greenlight.com.png</Employerlogo>
      <Employerdescription>Greenlight is a family fintech company that provides a banking app for families, serving over 6 million parents and kids.</Employerdescription>
      <Employerwebsite>https://www.greenlight.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/greenlight/084768a3-1c27-4390-b94c-d987f636811a</Applyto>
      <Location>Atlanta</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>c330fc0f-900</externalid>
      <Title>Senior Manager, Customer Success Operations</Title>
      <Description><![CDATA[<p>At Eve, we&#39;re redefining what&#39;s possible in legal technology. Our mission is to empower plaintiff law firms with AI-driven solutions that elevate how they operate, serve clients, and grow.</p>
<p>We believe the future of law will be built by &#39;AI-Native Law Firms&#39; , firms that are managed, scaled, and optimized by intelligent systems rather than manual processes and endless administrative work.</p>
<p>Our technology augments the capabilities of attorneys across every stage of a case , from intake and document review to strategy and settlement , so they can focus on what truly matters: achieving the best outcomes for their clients.</p>
<p>Why Join Eve:</p>
<p>Product-market fit: Eve is used by over 550+ law firms, and we&#39;re growing fast. Backed by top investors: We&#39;ve raised over $160M from world-class partners including Spark Capital, Andreessen Horowitz(A16z), Menlo Ventures, and Lightspeed. Built by a world-class team: Engineers, designers, and operators from places like Scale, Meta, Airbnb, Cruise, Square, Rubrik, and Lyft are building Eve from the ground up. AI-Native from day one: We&#39;re on the bleeding edge of AI, collaborating directly with teams at OpenAI and Anthropic to build best-in-class AI workflows tailored for legal work. Explosive growth: We are growing 2X revenue Quarter over Quarter.</p>
<p><strong>Why This Role:</strong></p>
<p>You&#39;ll build the data, process, and tooling layer for Eve&#39;s Customer Success organization. We have 900+ accounts, a growing renewal book, and a CS team that&#39;s scaling fast. Health scores need to be rebuilt. Renewal forecasting needs to be trusted by Finance. Playbooks exist in theory but aren&#39;t operationalized. The data is there (Snowflake, product telemetry, CRM), but it&#39;s not connected to the workflows that CSMs run every day. This is not a reporting role. You&#39;ll spend most of your time designing systems: building the health score methodology that predicts churn, operationalizing playbooks, and standing up renewal forecasting that leadership and Finance trust. The rest of your time goes to capacity planning, territory design, and partnering with RevOps and Finance to make sure CS metrics are calculated consistently across the business.</p>
<p><strong>What You&#39;ll Accomplish:</strong></p>
<ul>
<li>Own CS analytics end-to-end: health scoring methodology, renewal forecasting, portfolio segmentation, and the dashboards that leadership and Finance rely on for decision-making</li>
<li>Design and operationalize playbooks that drive customer outcomes: onboarding workflows, risk intervention sequences, expansion motions, renewal execution</li>
<li>Use AI tools as part of your daily workflow: automating data pulls, building smarter alerting, finding patterns that manual analysis would miss</li>
<li>Build and maintain the CS tech stack. Clean data, working automations, tooling the team relies on.</li>
</ul>
<p>Partner with GTM Systems on shared infrastructure across the revenue stack</p>
<ul>
<li>Deliver capacity planning, territory design, and comp modeling that keeps pace with a growing team and portfolio</li>
<li>Identify patterns across the portfolio: what&#39;s driving churn and where onboarding stalls. Turn those into specific actions for CSMs and leadership</li>
<li>Partner with RevOps and Finance to make sure CS metrics (GRR, NRR, renewal forecasts) are calculated consistently, trusted across the business, and ready for executive reporting</li>
</ul>
<p><strong>What We Are Looking For:</strong></p>
<ul>
<li>5-8+ years in CS Operations, Revenue Operations, or Business Operations at a B2B SaaS company, with at least 2 years focused on Customer Success</li>
<li>Hands-on experience building health scores, renewal forecasting models, and portfolio segmentation frameworks</li>
<li>Strong SQL skills and comfort working directly with data warehouses</li>
<li>Experience implementing and administering CS platforms, including configuration, workflow design, and driving adoption across a team</li>
<li>Track record of designing processes that CSMs actually follow</li>
<li>Ability to translate data into narrative for leadership: what&#39;s happening, why it matters, what to do about it</li>
<li>Comfortable using AI tools (Claude, ChatGPT, Copilot) for analysis, automation, and workflow design</li>
</ul>
<p><strong>You&#39;ll Thrive in This Role If You Have:</strong></p>
<ul>
<li>Experience with consumption-based or usage-based pricing models</li>
<li>Background in legal technology or professional services SaaS</li>
<li>Familiarity with AI/ML products and the adoption challenges they create</li>
<li>Experience with CS platforms like Gainsight, Vitally, ChurnZero, or Totango</li>
<li>Experience with Salesforce or HubSpot CRM administration and reporting.</li>
<li>Familiarity with Snowflake, Looker, or similar BI/warehouse tools</li>
<li>Background scaling CS operations at a company growing to $100M+ in ARR</li>
</ul>
<p><strong>Additional Information</strong></p>
<p>Benefits:</p>
<p>Competitive Salary &amp; Equity 401(k) Program with Employer Matching Health, Dental, Vision and Life Insurance Short Term and Long Term Disability Commuter Benefits Autonomous Work Environment Office Setup Reimbursement Flexible Time Off (FTO) + Holidays Quarterly Team Gatherings In office Perks</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Customer Success Operations, Revenue Operations, Business Operations, SQL, Data Warehouses, CS Platforms, Gainsight, Vitally, ChurnZero, Totango, Salesforce, HubSpot, CRM Administration, Reporting, AI Tools, Claude, ChatGPT, Copilot, Consumption-Based Pricing Models, Usage-Based Pricing Models, Legal Technology, Professional Services SaaS, AI/ML Products, Adoption Challenges, Snowflake, Looker, BI/Warehouse Tools</Skills>
      <Category>Operations</Category>
      <Industry>Technology</Industry>
      <Employername>Eve</Employername>
      <Employerlogo>https://logos.yubhub.co/eve.com.png</Employerlogo>
      <Employerdescription>Eve provides AI-driven solutions for plaintiff law firms to elevate their operations, client service, and growth.</Employerdescription>
      <Employerwebsite>https://www.eve.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/Eve/acb41a18-4b86-44e7-8d29-eaf00b6b9091</Applyto>
      <Location>US</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>be821069-a7f</externalid>
      <Title>Asset Data Engineer</Title>
      <Description><![CDATA[<p>Join the Asset Data team and build the streaming data infrastructure that powers Anchorage&#39;s digital asset platform. You&#39;ll design systems that ingest real-time blockchain and market data from diverse providers, transforming raw feeds into certified, trusted data products.</p>
<p>We&#39;re creating contract-governed supply chains that let us onboard new assets and providers quickly while maintaining the low-latency, high-availability SLOs our business depends on.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Build streaming data pipelines for blockchain data (onchain transactions, staking rewards, validator info) and market data (prices, trades, order books)</li>
<li>Design and implement data contracts and validation gates that enforce quality and schema compliance at ingestion points</li>
</ul>
<p><strong>Complexity and Impact of Work:</strong></p>
<ul>
<li>Collaborate on designing the architecture for standardized ingestion patterns that enable rapid onboarding of new blockchains and market data feeds</li>
<li>Establish redundancy and failover patterns to meet Tier 1 availability and freshness SLOs for critical data products</li>
</ul>
<p><strong>Organizational Knowledge:</strong></p>
<ul>
<li>Collaborate with Protocols, Trading, and Custody teams to understand their data needs and design certified data products with clear SLAs</li>
<li>Partner with Data Platform team on orchestration, storage patterns (BigLake), and metadata management (Atlan)</li>
</ul>
<p><strong>Communication and Influence:</strong></p>
<ul>
<li>Advocate for contract-governed data supply chains and help establish engineering standards for producer patterns across the org</li>
<li>Contribute to architectural decisions and help mature the team&#39;s practices around observability, testing, and operational excellence</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>5-7+ years building streaming or high-throughput data systems: You have experience designing and operating production data pipelines that handle large volumes with low latency and high reliability</li>
<li>Solid backend engineering skills: You&#39;re proficient in Go or Python and have built services that interact with streaming infrastructure (Kafka, pub/sub, websockets, REST APIs)</li>
<li>Blockchain data familiarity: You understand blockchain concepts and are comfortable working with on-chain data (transactions, events, staking, validators) across multiple chains with different data models</li>
<li>Data engineering adjacent skills: You&#39;re comfortable with data transformation patterns, schema evolution, and working with cloud data warehouses (BigQuery) and storage systems (GCS, BigLake)</li>
<li>Operational mindset: You have experience deploying and operating services on cloud platforms (preferably GCP), with strong practices around monitoring, alerting, and incident response</li>
</ul>
<p><strong>Preferred Qualifications:</strong></p>
<ul>
<li>Staking data expertise: You&#39;ve worked with staking rewards, validator data, or proof-of-stake blockchain infrastructure</li>
<li>Market data systems: You&#39;ve built systems that ingest and process market data (prices, trades, order books) from exchanges or data vendors</li>
<li>Infrastructure as code: You have experience with Terraform, Kubernetes, and modern DevOps practices</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Go, Python, Kafka, pub/sub, websockets, REST APIs, blockchain data, data transformation patterns, schema evolution, cloud data warehouses, storage systems, stake data expertise, market data systems, infrastructure as code</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anchorage Digital</Employername>
      <Employerlogo>https://logos.yubhub.co/anchorage.co.png</Employerlogo>
      <Employerdescription>Anchorage Digital is a regulated crypto platform that provides institutions with integrated financial services and infrastructure solutions.</Employerdescription>
      <Employerwebsite>https://www.anchorage.co/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/anchorage/82139746-fb0e-44b9-bbb6-ae078e5d251a</Applyto>
      <Location>New York City</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>f160b80e-77d</externalid>
      <Title>eCommerce Data Analyst</Title>
      <Description><![CDATA[<p>We are seeking a highly analytical and impact-driven eCommerce Data Analyst to join our global eCommerce team. This role sits at the intersection of customer behavior, digital experience, and revenue performance, transforming data into actionable insights that drive growth across our websites.</p>
<p>You will analyze full-funnel customer journeys across web, marketing, and CRM touchpoints, leveraging modern analytics platforms, cloud data warehouses, and A/B testing. The ideal candidate thrives in close collaboration with marketing, webstore, product, and engineering teams.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Leverage CDP and first-party data to build customer segments, cohorts, and lifecycle views, partnering with marketing and CRM teams to support audience activation, personalization, and campaign optimization.</li>
<li>Conduct in-depth analysis of website traffic, user behavior, and eCommerce performance to identify trends, opportunities, and risks across the full customer funnel.</li>
<li>Ideate, support, and analyze A/B and multivariate experiments, defining success metrics, interpreting results with statistical rigor, and recommending next steps based on business impact.</li>
<li>Collaborate with cross-functional teams (marketing, webstore, product, and engineering) to develop and execute data-driven strategies for web and eCommerce initiatives.</li>
<li>Define and document event tracking and measurement requirements aligned with business objectives, ensuring accurate and scalable data collection.</li>
<li>Monitor and report on the effectiveness of promotional activities, marketing initiatives, and website performance, translating results into actionable insights.</li>
<li>Build and maintain automated dashboards and self-service reporting, communicating insights through clear narratives, visualizations, and executive-ready summaries that proactively surface opportunities and risks.</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>3+ years of experience in Commerce, web, or digital analytics</li>
<li>Bachelor’s degree in Data Science, Statistics, Economics, Business Information Systems, or a related field.</li>
<li>Proven experience analyzing consumer-facing digital products or eCommerce platforms.</li>
<li>Strong understanding of eCommerce KPIs, digital marketing metrics, and customer lifecycle measurement.</li>
<li>Must have hands-on experience with SQL, cloud data warehouse (e.g Snowflake), GA4, CDP (such as mParticle).</li>
<li>Experience working with large, complex, and multi-source datasets.</li>
<li>Excellent data visualization skills (Looker Studio, PowerBi) and ability to communicate complex insights effectively.</li>
<li>Strong attention to detail and ability to manage multiple projects simultaneously.</li>
</ul>
<p><strong>Nice to have:</strong></p>
<ul>
<li>Experience with AI-assisted analytics, forecasting, or anomaly detection.</li>
<li>Knowledge of product analytics, personalization, or recommendation systems.</li>
<li>Exposure to server-side tracking, event schemas, and data contracts.</li>
<li>Experience working in global or multi-region eCommerce environments.</li>
<li>AI-ecommerce experience</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$75,000—$110,000 USD</Salaryrange>
      <Skills>SQL, cloud data warehouse, GA4, CDP, data visualization, data analysis, A/B testing, multivariate experiments, AI-assisted analytics, forecasting, anomaly detection, product analytics, personalization, recommendation systems, server-side tracking, event schemas, data contracts</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Corsair</Employername>
      <Employerlogo>https://logos.yubhub.co/corsair.com.png</Employerlogo>
      <Employerdescription>Corsair is a leading manufacturer of gaming peripherals and components.</Employerdescription>
      <Employerwebsite>https://www.corsair.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://edix.fa.us2.oraclecloud.com/hcmUI/CandidateExperience/en/sites/CX_1/job/8662</Applyto>
      <Location>Milpitas, CA</Location>
      <Country></Country>
      <Postedate>2026-03-10</Postedate>
    </job>
    <job>
      <externalid>7af16166-8fd</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impacts the organization.</li>
<li>Works independently, receives minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>
<li>Reviews high level design to ensure alignment to Solution Architecture.</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>
<li>Mentor developers and create reference implementations/frameworks.</li>
<li>Partners with System Architects to elaborate capabilities and features.</li>
<li>Delivers single domain architecture solutions and executes continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Over 6 years of experience as a senior domain architect for Data domains</li>
<li>Advanced English Level</li>
<li>Masters&#39; degree (PLUS)</li>
<li>Insurance Experience (PLUS) Financial Services (PLUS)</li>
</ul>
<p><strong>Technical &amp; Business Skills:</strong></p>
<ul>
<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>
<li>Data Architecture / Data Modeling – Advanced (MUST)</li>
<li>Data Warehouse – Advanced (MUST)</li>
<li>Cloud Data Platforms - Advanced</li>
<li>Data Integration Tools – Advanced</li>
<li>Snowflake or Databricks - Intermediate (4-6 Years) MUST</li>
<li>Any Cloud - Intermediate (4-6 Years)</li>
<li>Power BI or Tableau - Intermediate (4-6 Years)</li>
<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>
<li>Data Lakehouse – Intermediate (MUST)</li>
</ul>
<ul>
<li>Data Governance - Intermediate</li>
<li>AI/ML - Entry Level (PLUS)</li>
<li>Master Data Management - Intermediate</li>
<li>Operational Data Management - Intermediate</li>
</ul>
<p><strong>Benefits:</strong></p>
<p>This position comes with a competitive compensation and benefits package.</p>
<ul>
<li>A competitive salary and performance-based bonuses.</li>
<li>Comprehensive benefits package.</li>
<li>Flexible work arrangements (remote and/or office-based).</li>
<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>
<li>Private Health Insurance.</li>
<li>Paid Time Off.</li>
<li>Training &amp; Development opportunities in partnership with renowned companies.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, AI/ML, Master Data Management, Operational Data Management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global consulting and technology services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/jdUFHSPZZjHsgd3TR4R3BS/remote-fbs-senior-data-domain-architect-in-colombia-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>2a56a653-c18</externalid>
      <Title>Palantir Engineer Specialist - Sr. Consultant - Principal</Title>
      <Description><![CDATA[<p><strong>Palantir Engineer Specialist</strong></p>
<p><strong>Sr. Consultant - Principal</strong></p>
<p><strong>London</strong></p>
<p>Do you want to boost your career and collaborate with expert, talented colleagues to solve and deliver against our clients&#39; most important challenges? We are growing and are looking for people to join our team. You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organisation allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset. Are you ready?</p>
<p><strong>About Your Role</strong></p>
<p>As a <strong>Senior Consultant / Principal Consultant – Palantir Engineer</strong>, you lead and deliver end-to-end, data-driven solutions using <strong>Palantir Foundry</strong> in complex client environments. You operate at the intersection of engineering, data, and consulting, working closely with business and technical stakeholders to translate complex problems into scalable, production-ready solutions. You combine strong hands-on technical skills with a consulting mindset, taking ownership of solution design, implementation, and adoption across organisations.</p>
<p><strong>Your role will include:</strong></p>
<ul>
<li>Own the <strong>end-to-end delivery</strong> of Palantir Foundry–based solutions, from problem definition to production</li>
<li>Design and implement <strong>data pipelines and transformations</strong> across diverse data sources</li>
<li>Model data using <strong>Foundry Ontology</strong> concepts to support analytics and operational use cases</li>
<li>Build scalable, reliable solutions using <strong>Python, SQL, and PySpark</strong> within Foundry</li>
<li>Collaborate closely with business stakeholders to define requirements, success metrics, and roadmaps</li>
<li>Support <strong>prototyping, productionisation, and scaling</strong> of data-driven applications</li>
<li>Ensure solutions meet requirements for <strong>data quality, governance, security, and performance</strong></li>
<li>Act as a technical advisor within project teams and contribute to best practices</li>
</ul>
<p><strong>Requirements</strong></p>
<p><strong>What you bring – required</strong></p>
<p><strong>Experience &amp; Seniority</strong></p>
<ul>
<li>Proven experience as a <strong>Senior Consultant or Principal Consultant</strong> in data, analytics, or platform engineering</li>
<li>Strong experience delivering <strong>client-facing data solutions</strong> in complex environments</li>
<li>Ability to take ownership and work independently in ambiguous problem spaces</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong programming skills in <strong>Python</strong> and <strong>SQL</strong>; <strong>PySpark</strong> experience required</li>
<li>Hands-on experience with <strong>Palantir Foundry</strong>, including:</li>
<li>Pipeline Builder / Code Workbook</li>
<li>Data integration and transformation</li>
<li>Ontology modelling and data lineage</li>
<li>Solid understanding of <strong>data architectures</strong>, including data lakes, lakehouses, and data warehouses</li>
<li>Experience working with APIs, databases, and structured / semi-structured data</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience building <strong>scalable ETL/ELT pipelines</strong></li>
<li>Familiarity with <strong>CI/CD concepts</strong>, testing, and production deployments</li>
<li>Strong focus on <strong>solution quality, maintainability, and performance</strong></li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field <strong>or equivalent practical experience</strong></li>
</ul>
<p><strong>Nice to have</strong></p>
<ul>
<li>Experience with <strong>cloud platforms</strong> (AWS, Azure, GCP)</li>
<li>Familiarity with <strong>containerisation</strong> (Docker, Kubernetes)</li>
<li>Prior experience as a <strong>Palantir FDE</strong> or in Foundry-heavy delivery roles</li>
<li>Domain experience in industries such as <strong>Energy, Finance, Public Sector, Healthcare, or Logistics</strong></li>
</ul>
<p><strong>Benefits</strong></p>
<p><strong>About your team</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will be utilizing the most innovative technological solutions in modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics strategy, Data Management &amp; Governance, Data Platforms &amp; engineering, Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognised as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on the Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. Furthermore, Infosys has been recognised by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you.... Apply today!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, SQL, PySpark, Palantir Foundry, Pipeline Builder, Code Workbook, Data integration, Data transformation, Ontology modelling, Data lineage, Data architectures, Data lakes, Lakehouses, Data warehouses, APIs, Databases, Structured data, Semi-structured data, ETL/ELT pipelines, CI/CD concepts, Testing, Production deployments, Solution quality, Maintainability, Performance, Bachelor’s degree, Master’s degree, Computer Science, Engineering, Mathematics, Cloud platforms, Containerisation, Palantir FDE, Foundry-heavy delivery roles, Domain experience in industries such as Energy, Finance, Public Sector, Healthcare, or Logistics</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. The company is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/2A8U1ryerVijb4fFAc6i8u/hybrid-palantir-engineer-specialist---sr.-consultant---principal-in-london-at-infosys-consulting---europe</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7b03b30a-b20</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>
<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p>**Key Responsibilities:*</p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
</ul>
<ul>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impacts the organization.</li>
</ul>
<ul>
<li>Works independently, receives minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>
</ul>
<ul>
<li>Reviews high level design to ensure alignment to Solution Architecture.</li>
</ul>
<ul>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>
</ul>
<ul>
<li>Mentor developers and create reference implementations/frameworks.</li>
</ul>
<ul>
<li>Partners with System Architects to elaborate capabilities and features.</li>
</ul>
<ul>
<li>Delivers single domain architecture solutions and executes continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, Master Data Management, Operational Data Management, AI/ML</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/1U952YA2QBa8zK7Tm5d3Lm/remote-fbs-senior-data-domain-architect-in-mexico-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>dcfed817-412</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Senior Data Domain Architect to join our team. As a Senior Data Domain Architect, you will design and develop Data/Domain IT architecture solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilize in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery</li>
<li>Solve complex problems and partner effectively to execute broad, continuous Domain level architecture improvement roadmaps that impacts the organization</li>
<li>Work independently, receives minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge</li>
<li>Review high level design to ensure alignment to Solution Architecture</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives</li>
<li>Mentor developers and create reference implementations/frameworks</li>
<li>Partner with System Architects to elaborate capabilities and features</li>
<li>Deliver single domain architecture solutions and execute continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Over 6 years of experience as a senior domain architect for Data domains</li>
<li>Advanced English Level</li>
<li>Masters&#39; degree (PLUS)</li>
<li>Insurance Experience (PLUS) Financial Services (PLUS)</li>
</ul>
<p><strong>Technical &amp; Business Skills:</strong></p>
<ul>
<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>
<li>Data Architecture / Data Modeling – Advanced (MUST)</li>
<li>Data Warehouse – Advanced (MUST)</li>
<li>Cloud Data Platforms - Advanced</li>
<li>Data Integration Tools – Advanced</li>
<li>Snowflake or Databricks - Intermediate (4-6 Years) MUST</li>
<li>Any Cloud - Intermediate (4-6 Years)</li>
<li>Power BI or Tableau - Intermediate (4-6 Years)</li>
<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>
<li>Data Lakehouse – Intermediate (MUST)</li>
</ul>
<p><strong>Benefits:</strong></p>
<ul>
<li>A competitive salary and performance-based bonuses</li>
<li>Comprehensive benefits package</li>
<li>Flexible work arrangements (remote and/or office-based)</li>
<li>Private Health Insurance</li>
<li>Paid Time Off</li>
<li>Training &amp; Development opportunities in partnership with renowned companies</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/x7tKXYFBB815ca6oBV5T2E/remote-fbs-senior-data-domain-architect-in-brazil-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>ee2fcbdc-fc4</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will act as a senior technical leader in complex data and analytics engagements, shaping and governing end-to-end enterprise data architectures, leading technical teams, and serving as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will be responsible for ensuring that enterprise data and analytics solutions are scalable, secure, and production-ready, while translating business requirements into robust technical designs and delivery roadmaps.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:</li>
<li>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures</li>
<li>Modern Data Architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data Platform, DevOps, deployment and security architectures</li>
<li>Analytics and AI enablement architectures</li>
<li>Hands-on experience with cloud data platforms, e.g.:</li>
<li>Azure, AWS or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:</li>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data Management software and tools</li>
<li>Security, access control, and compliance considerations</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on Experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will be utilizing the most innovative technological solutions in modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on the Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you.... Apply today!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, Azure, AWS or GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, Postgres, SQL Server, Oracle, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, Docker / Kubernetes, Advanced analytics, AI / ML or GenAI, Streaming platforms (e.g. Kafka, Azure Event Hubs), Data governance or metadata tools, Cloud, data, or architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. The company is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/uuSzzCt8qNbo6UpEFkSyjY/hybrid-principal-consultant---data-architecture-in-london-at-infosys-consulting---europe</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>56dc9a51-e66</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will act as a senior technical leader in complex data and analytics engagements. You will shape and govern end-to-end enterprise data architectures, lead technical teams, and serve as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:</li>
<li>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures</li>
<li>Modern Data Architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data Platform, DevOps, deployment and security architectures</li>
<li>Analytics and AI enablement architectures</li>
<li>Hands-on experience with cloud data platforms, e.g.:</li>
<li>Azure, AWS or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:</li>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data Management software and tools</li>
<li>Security, access control, and compliance considerations</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on Experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>You will be utilizing the most innovative technological solutions in modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on the Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you.... Apply today!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>enterprise data architecture, system data integration, data engineering, analytics, modern data architectures, Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, cloud data platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, SQL, relational databases, Postgres, SQL Server, Oracle, NoSQL databases, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, data migration programmes, data pipelines, orchestration, automation, CI/CD concepts, production-grade deployments, distributed systems, Docker, Kubernetes, data management and governance principles, data quality, metadata, lineage, master data management, data management software and tools, security, access control, compliance considerations, Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience, advanced analytics, AI / ML or GenAI, streaming platforms, Kafka, Azure Event Hubs, data governance or metadata tools, cloud, data, architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. It is a mid-size player with a supportive, entrepreneurial spirit.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/hpBWjvvy8D6B1f818cHxZR/remote-principal-consultant---data-architecture-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>fbb19758-f83</externalid>
      <Title>Principal Consultant Data Architecture (m/w/d)</Title>
      <Description><![CDATA[<p>Are you looking to advance your career and work with experienced, talented colleagues to successfully solve the most significant challenges of our clients? We are growing further and seeking engaged individuals to strengthen our team. You will be part of a dynamic, strongly growing company with over 300,000 employees.</p>
<p>Our dynamic organisation allows you to work across themes and bring in your ideas, experiences, creativity, and goal orientation. Are you ready?</p>
<p>As a Principal Consultant Data Architecture, you will be the technical leader in complex data and analytics projects. You will design and be responsible for comprehensive enterprise data architectures, lead technical teams, and be a trusted technical advisor for customers and internal stakeholders.</p>
<p>You will ensure that enterprise data and analytics solutions are scalable, secure, and operational, translate technical requirements into robust technical images, and plan the introduction.</p>
<p><strong>Your Tasks:</strong></p>
<ul>
<li>Definition and governance of target architectures for enterprise data, integration, and analytics in cloud and hybrid environments</li>
<li>Translation of business goals into scalable, secure, and compliant architectures</li>
<li>Leadership of the conception of comprehensive end-to-end data solutions (data intake, data integration, storage, security, processing, analytics, AI support)</li>
<li>Steering and accompanying delivery teams during implementation, rollout, and establishment of operational readiness</li>
<li>Senior technical contact person for architects, IT managers, and technical teams of customers</li>
<li>Mentoring of system and data architects as well as programmers</li>
<li>Participation in the further development of best practices and reference architectures</li>
<li>Support of presales and solution design activities from a technical perspective</li>
</ul>
<p><strong>What You Bring - Minimum Requirements</strong></p>
<p><strong>Experience &amp; Seniority</strong></p>
<ul>
<li>At least 5 years of relevant professional experience in enterprise data architecture, data integration, data engineering, or analytics</li>
<li>Experience in leading enterprise data architecture workstreams or technical teams</li>
<li>Strong customer and advisory experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>In-depth expertise in modern data architectures, particularly:</li>
</ul>
<ol>
<li>Data Mesh / Data Fabric / Data Lake / Data Warehouse Architectures</li>
<li>Principles of modern data architecture designs</li>
<li>Integration patterns for batch and streaming data</li>
<li>Data platform, DevOps, deployment, and security architectures</li>
<li>Analytics and AI enablement architectures</li>
</ol>
<ul>
<li>Practical experience with cloud data platforms, such as:</li>
</ul>
<ol>
<li>Azure, AWS, or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
</ol>
<ul>
<li>Very good SQL knowledge as well as experience with relational databases (e.g. PostgreSQL, SQL-Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Good understanding of API-based and event-driven architectures</li>
<li>Experience in conceiving and steering enterprise data migration programs (including mapping, transformation rules, data quality measures, etc.)</li>
</ul>
<p><strong>Engineering &amp; Platform Fundamentals</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Knowledge of CI/CD concepts and production-ready deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes knowledge is an advantage</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Very good understanding of data management and governance principles, particularly:</li>
</ul>
<ol>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data management software and tools</li>
<li>Security, access, and compliance requirements</li>
</ol>
<ul>
<li>Bachelor&#39;s or master&#39;s degree in computer science, engineering, mathematics, or a related field, or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Experience with advanced analytics, AI/ML, or GenAI from an architect&#39;s perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Practical experience with data governance or metadata tools</li>
<li>Cloud or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility (Germany)</strong></p>
<ul>
<li>Fluent German skills (at least C1) for customer communication in the German-speaking market</li>
<li>Very good English skills</li>
<li>Project-related travel readiness</li>
</ul>
<p><strong>About Your Team</strong></p>
<p>You will become part of our growing data and analytics teams. In this area, you will work with modern technologies in modern data ecosystems. You have the opportunity to turn your own ideas into results - in the areas of data and analytics strategy, data management and governance, data platforms and engineering, as well as analytics and data science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>You will become an employee of a globally renowned management consulting firm that is at the forefront of industry disruption. We work across industries with leading companies. Our culture is inclusive and entrepreneurial. As a mid-sized consulting firm embedded in the size of Infosys, we can support our customers worldwide and throughout the entire transformation process in a partnership-like manner.</p>
<p>Our values IC-LIFE - Inclusion, Equity &amp; Diversity, Client, Leadership, Integrity, Fairness, and Excellence - form our compass of values. Further information can be found on our career website.</p>
<p>In Europe, we are awarded by the Financial Times and Forbes as one of the leading consulting firms. Infosys is one of the top employers in Germany 2023 and has been certified by the Top Employers Institute for outstanding working conditions in Europe for five years in a row.</p>
<p>We offer a market-leading remuneration, attractive additional benefits, as well as excellent further education and development opportunities. Have you become curious? Then we look forward to your application</p>
<p>More about Infosys Consulting - Europe</p>
<p><strong>Visit website</strong></p>
<p>Where Innovation meets Excellence.</p>
<p>Infosys Consulting is a globally renowned management consulting firm that is on the front-line of industry disruption. We are a mid-size player with a supportive, entrepreneurial spirit that works with a market-leading brand in every sector, while our parent organization Infosys is a top-5 powerhouse IT brand that is outperforming the market and experiencing rapid growth.</p>
<p>Our consulting business is annually recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths we offer to our consultants. We are committed to fostering an inclusive work culture that inspires everyone to deliver their best.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Mesh, Data Fabric, Data Lake, Data Warehouse Architectures, Principles of modern data architecture designs, Integration patterns for batch and streaming data, Data platform, DevOps, deployment, and security architectures, Analytics and AI enablement architectures, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, PostgreSQL, SQL-Server, Oracle, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, Enterprise data migration programs</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting is a globally renowned management consulting firm that works with a market-leading brand in every sector, while its parent organization Infosys is a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/sve4gTuNFLf3RtEjhQMzHp/remote-principal-consultant-data-architecture-(m%2Fw%2Fd)--deutschlandweit-in-munich-at-infosys-consulting---europe</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>9475bb73-df7</externalid>
      <Title>Product Owner, Enrichment</Title>
      <Description><![CDATA[<p><strong>About Anthropic</strong></p>
<p>Anthropic&#39;s mission is to create reliable, interpretable, and steerable AI systems. We want AI to be safe and beneficial for our users and for society as a whole.</p>
<p><strong>About the role</strong></p>
<p>We are looking for a Product Owner, Enrichment to own and drive the strategy, architecture, and execution of our data enrichment ecosystem. This role sits at the intersection of Revenue Operations, Data Engineering, and Go-to-Market strategy, and is responsible for building and maintaining a best-in-class enrichment infrastructure that delivers a reliable, comprehensive source of truth for company and contact data across global markets.</p>
<p>You will be the subject matter expert and product owner for all enrichment tools, data sources, and processes—including platforms like Clay, Dun &amp; Bradstreet, ZoomInfo, and other third-party providers. You will design and operate the systems that power account hierarchies, firmographic enrichment, contact discovery, and signal detection, ensuring our GTM teams have the accurate, complete data they need to identify, prioritise, and close business.</p>
<p>This is a hands-on, technically-oriented role that requires deep experience working with large datasets, complex system integrations, and Salesforce data modelling. You will collaborate closely with Sales, Marketing, Data Science, Data Engineering, and Revenue Operations to ensure our enrichment strategy supports both near-term GTM execution and long-term data infrastructure goals.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Own the end-to-end enrichment strategy and roadmap, serving as the product owner for all enrichment tools, vendors, and data sources including Clay, Dun &amp; Bradstreet, ZoomInfo, and emerging providers</li>
</ul>
<ul>
<li>Build and maintain a unified enrichment master—a reliable source of truth for company and person data including parent-child account hierarchies, firmographics, technographics, and contact intelligence across domestic and international markets</li>
</ul>
<ul>
<li>Design and implement waterfall enrichment workflows that orchestrate multiple data providers to maximise coverage, accuracy, and cost efficiency while minimising redundancy</li>
</ul>
<ul>
<li>Architect enrichment data models within Salesforce, making strategic decisions about how enrichment data is stored, related, and surfaced (e.g., custom objects vs. direct field integration, parent account structures, enrichment audit trails)</li>
</ul>
<ul>
<li>Hands-on data manipulation and transformation—write queries, build data pipelines, and work directly with data warehouses (e.g., Snowflake, BigQuery) to clean, transform, match, and deduplicate enrichment data at scale</li>
</ul>
<ul>
<li>Lead international enrichment strategy, addressing the unique challenges of enriching company and contact data across global markets with varying data availability, provider coverage, and regulatory requirements</li>
</ul>
<ul>
<li>Partner with Data Science and Data Engineering to define enrichment schemas, resolve entity matching challenges, and build scalable infrastructure that supports both real-time and batch enrichment processes</li>
</ul>
<ul>
<li>Collaborate with Sales, Marketing, and Revenue Operations to understand GTM data needs, translate business requirements into enrichment solutions, and ensure enrichment outputs directly support pipeline generation, territory planning, lead routing, and account scoring</li>
</ul>
<ul>
<li>Define and track enrichment KPIs including match rates, data completeness, freshness, accuracy, and downstream GTM impact—using metrics to continuously improve the enrichment ecosystem</li>
</ul>
<ul>
<li>Evaluate and onboard new enrichment vendors and data sources, conducting proof-of-concept testing and negotiating contracts in partnership with procurement</li>
</ul>
<ul>
<li>Explore and implement AI-powered enrichment capabilities, including prompt-based enrichment using LLMs to supplement traditional data providers for emerging companies, startups, and hard-to-enrich segments</li>
</ul>
<p><strong>You may be a good fit if you have:</strong></p>
<ul>
<li>10+ years of experience in data enrichment, data operations, or revenue/marketing operations with hands-on ownership of enrichment tools and strategy in a B2B SaaS or enterprise technology environment</li>
</ul>
<ul>
<li>Deep expertise with enrichment platforms such as Clay, Dun &amp; Bradstreet (D-U-N-S, Data Blocks, hierarchies), ZoomInfo, Clearbit, People Data Labs, or comparable providers, including experience building waterfall enrichment workflows and enrichment masters</li>
</ul>
<ul>
<li>Strong Salesforce experience (required)—including data modelling for enrichment (custom objects, account hierarchies, parent-child relationships), integration architecture, and understanding of how enrichment data flows through the CRM to support GTM processes</li>
</ul>
<ul>
<li>Hands-on technical skills for data manipulation including SQL proficiency, experience with data warehouses (Snowflake, BigQuery, or similar), and comfort working with ETL/reverse ETL pipelines, APIs, and data transformation tools</li>
</ul>
<ul>
<li>Strong product ownership mindset with experience managing roadmaps, backlogs, and stakeholder priorities—able to translate business needs into technical requirements and drive execution across cross-functional teams</li>
</ul>
<ul>
<li>Dual data + RevOps mindset—equally comfortable working with Data Science and Data Engineering on infrastructure and schema design as you are partnering with Sales and GTM teams on pipeline and territory optimisation</li>
</ul>
<ul>
<li>Excellent communication skills to bridge technical and business audiences, lead stakeholder discovery sessions, and present enrichment strategy and impact to leadership</li>
</ul>
<p><strong>Strong candidates may have:</strong></p>
<ul>
<li>Experience building or leveraging AI-powered enrichment prompts (e.g., using LLMs to research and enrich company data, identify signals, or fill gaps where traditional providers lack coverage)</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data enrichment, data operations, revenue/marketing operations, enrichment tools, data sources, platforms like Clay, Dun &amp; Bradstreet, ZoomInfo, Salesforce, data modelling, integration architecture, SQL, data warehouses, ETL/reverse ETL pipelines, APIs, data transformation tools, product ownership, roadmaps, backlogs, stakeholder priorities, data science, data engineering, infrastructure, schema design, communication, technical and business audiences, AI-powered enrichment, LLMs, prompt-based enrichment, emerging companies, startups, hard-to-enrich segments</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a quickly growing organisation that aims to create reliable, interpretable, and steerable AI systems. It has a team of researchers, engineers, policy experts, and business leaders working together to build beneficial AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5127289008</Applyto>
      <Location>San Francisco, CA | New York City, NY</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>39a574a0-94c</externalid>
      <Title>Technical Program Manager, Marketing Technology</Title>
      <Description><![CDATA[<p>As a Technical Program Manager for Marketing Technology, you will lead our Marketing Mix Modeling (MMM), incrementality testing, brand measurement, and marketing data infrastructure programs. You&#39;ll orchestrate complex, cross-functional initiatives spanning vendor partnerships, data infrastructure, experimentation design, and stakeholder alignment to build world-class marketing measurement capabilities for Anthropic&#39;s growth.</p>
<p>You&#39;ll serve as the central coordinator between Data Science, Growth Marketing, Brand Marketing, Product, Engineering, Data Infrastructure, Privacy, Finance, and external partners including MMM vendors, media platform partners, and agencies. This role requires someone who can navigate technical complexity, drive alignment across diverse stakeholders, and translate between business strategy and technical execution across both performance and brand measurement.</p>
<p>As the program lead for our measurement infrastructure, you&#39;ll be responsible for delivering our MMM proof-of-concept, establishing ongoing experimentation frameworks, designing and executing brand lift studies, leading the strategic assessment and migration of infrastructure, and building the operational foundations that enable data-driven marketing investment decisions at scale.</p>
<p>Responsibilities:</p>
<ul>
<li><strong>Marketing Measurement Intelligence</strong>: Lead end-to-end program management for MMM proof-of-concept execution, and transition to production operations. Design and execute comprehensive incrementality testing programs including geo-based experiments, conversion lift studies, and in-platform tests with media partners to calibrate and validate MMM outputs. Lead brand lift study design and execution across media platforms to measure awareness, consideration, favorability, and intent. Synthesize measurement results across MMM, brand lift, and incrementality testing for holistic marketing effectiveness views, building reporting frameworks that connect brand health metrics to business outcomes.</li>
<li><strong>MarTech Infrastructure &amp; Vendor Management:</strong> Support strategic assessments of marketing technology platforms, facilitating cross-functional evaluation and driving stakeholder alignment on build-vs-buy decisions while mapping dependencies and identifying blockers. Serve as key contact for vendors and agencies, managing relationships, business reviews, and coordinating execution with implementation roadmaps. Establish operational excellence standards including monitoring, alerting, version control, automated privacy validation, and incident response protocols while maintaining executive visibility into platform initiatives and working with Legal and Security on vendor reviews.</li>
<li><strong>Marketing Workflow Automation</strong>: Partner with Marketing leadership to identify, prioritize, and support deployment of AI-powered automation solutions for marketing operations. Establish governance frameworks, quality standards, validation processes, and monitoring mechanisms for automated marketing workflows. Build sustainable operating models for ongoing automation maintenance and continuous improvement. Track and measure automation impact to demonstrate ROI to leadership and cross-functional teams. Act as a center of excellence to socialize successful automation stories within Marketing to the broader Anthropic organisation.</li>
</ul>
<p>You May Be a Good Fit If You:</p>
<ul>
<li>Have 7+ years of technical program management experience, with 3+ years in marketing measurement, analytics infrastructure, or data science programs</li>
<li>Have a track record of successfully managing complex programs involving data science, marketing operations, engineering, agencies, and vendors</li>
<li>Possess deep understanding of MMM, attribution, incrementality testing, brand lift studies, and experimentation design</li>
<li>Have strong technical fluency with customer data platforms, marketing data sources, data warehouses, and analytics platforms</li>
<li>Have experience evaluating and migrating between marketing technology platforms or data infrastructure systems</li>
<li>Can engage with data scientists on regression analysis, causality, adstock modeling, and experimental design</li>
<li>Understand CDP architecture including event collection, tag management, streaming delivery, reverse ETL, and privacy compliance</li>
<li>Have a track record of delivering 0-to-1 programs on aggressive timelines with high visibility</li>
<li>Excel at translating technical concepts for varied audiences and can influence without authority</li>
<li>Thrive in ambiguous situations, bringing structure to complex challenges with competing priorities and limited resources</li>
<li>Have excellent written and verbal communication skills with executive presence and strong presentation abilities</li>
<li>Are passionate about Anthropic&#39;s mission and interested in the challenges of bringing frontier AI capabilities to market</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$290,000 - $365,000 USD</Salaryrange>
      <Skills>Marketing Mix Modeling, Incrementality testing, Brand measurement, Marketing data infrastructure, Customer data platforms, Marketing data sources, Data warehouses, Analytics platforms, CDP architecture, Event collection, Tag management, Streaming delivery, Reverse ETL, Privacy compliance, Regression analysis, Causality, Adstock modeling, Experimental design, Data science, Marketing operations, Engineering, Agencies, Vendor management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a quickly growing organisation that aims to create reliable, interpretable, and steerable AI systems. The company has a team of researchers, engineers, policy experts, and business leaders working together to build beneficial AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5108854008</Applyto>
      <Location>San Francisco, CA | New York City, NY</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>6acb1ca2-f64</externalid>
      <Title>Support Operations Analyst</Title>
      <Description><![CDATA[<p>As a Support Operations Analyst, you will build the analytical and workforce planning foundation that enables Anthropic&#39;s support organisation to scale intelligently. This role sits at the intersection of data analysis, capacity planning, and operational strategy—providing the insights leadership needs to make confident decisions about staffing, investment, and service levels.</p>
<p>You&#39;ll own forecasting and capacity planning across our support organisation, including FTE teams, AI-powered support channels, and vendor/contractor partnerships. This means building models that predict volume based on product launches, model releases, and customer growth; analysing the relationship between support metrics and business outcomes; and ensuring we have the right resources in the right places to meet our service commitments.</p>
<p>This is a high-ambiguity role where you&#39;ll often be building from scratch. We&#39;re looking for someone who can create structure where none exists, ask the right questions to scope problems, and translate messy data into narratives that drive action.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Build and maintain staffing models that translate SLA targets into headcount requirements across FTE and vendor teams</li>
<li>Forecast support volume by analysing historical trends, product release calendars, model launches, and customer base growth projections</li>
<li>Factor AI support effectiveness (automation rates, deflection, Fin AI Agent performance) into capacity models to ensure accurate human staffing projections</li>
<li>Partner with vendor managers to align contractor capacity with demand forecasts and service level requirements</li>
<li>Model scenarios to inform strategic decisions about staffing investments, vendor mix, and coverage models</li>
<li>Develop frameworks for prioritising automation initiatives based on volume impact and deflection potential</li>
</ul>
<p><strong>Analytics &amp; Reporting:</strong></p>
<ul>
<li>Maintain and enhance dashboards that track productivity, response times, CSAT, queue health, and other key support metrics</li>
<li>Investigate the relationship between support performance and business outcomes (e.g., how response time and satisfaction impact retention and churn)</li>
<li>Surface trends and insights that inform operational decisions—identifying what&#39;s driving volume, where bottlenecks emerge, and where investment is needed</li>
<li>Translate complex data into clear recommendations for leadership and cross-functional partners</li>
</ul>
<p><strong>Operational Partnership:</strong></p>
<ul>
<li>Collaborate with Support Ops, AI Support, and Human Support teams to ensure data and forecasts align with operational reality</li>
<li>Partner with Finance on headcount planning, budget alignment, and quarterly capacity reviews</li>
<li>Work with Product and Engineering to anticipate how launches and feature changes will impact support demand</li>
<li>Contribute to vendor performance management by establishing metrics frameworks and reporting cadences</li>
</ul>
<p><strong>You might be a good fit if you:</strong></p>
<ul>
<li>Have 4+ years of experience in workforce management, support operations analytics, business analytics, or similar roles—ideally in a support or customer success context</li>
<li>Are deeply analytical with strong SQL skills and experience with data warehouses (e.g., BigQuery) and analysis tools like Hex, Looker, or similar</li>
<li>Have hands-on experience with forecasting and capacity planning, including modelling staffing needs against service level targets</li>
<li>Are comfortable working with ambiguity—you can take a vague question, scope it into an answerable problem, and deliver insights that drive decisions</li>
<li>Understand support operations metrics (SLAs, handle time, CSAT, deflection rates) and can connect them to business impact</li>
<li>Have experience working with BPO or vendor partners on staffing, performance, and capacity alignment</li>
<li>Communicate clearly—you can translate technical analysis into narratives that resonate with both operational teams and executives</li>
<li>Are curious about AI and excited to work in an environment where the product and support landscape evolve rapidly</li>
<li>Thrive in fast-paced environments and can balance building foundational infrastructure with responding to urgent business questions</li>
<li>Have experience with workforce management platforms (e.g., Assembled, NICE, Calabrio) is a plus, but not required</li>
</ul>
<p>The annual compensation range for this role is $131,040 - $165,000 USD.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$131,040 - $165,000 USD</Salaryrange>
      <Skills>SQL, data warehouses, analysis tools, forecasting, capacity planning, workforce management, support operations analytics, business analytics, Hex, Looker, BigQuery, Assembled, NICE, Calabrio</Skills>
      <Category>Operations</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a rapidly growing organisation that aims to create reliable, interpretable, and steerable AI systems. The company has a team of researchers, engineers, policy experts, and business leaders working together to build beneficial AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5080931008</Applyto>
      <Location>San Francisco, CA | New York City, NY | Seattle, WA</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>a8eb2e15-0bb</externalid>
      <Title>Senior Business Systems Analyst, Finance Systems</Title>
      <Description><![CDATA[<p><strong>About the role</strong></p>
<p>We are seeking an experienced Senior Business Systems Analyst to join our Finance Systems team at Anthropic. In this role, you will serve as the internal functional lead for our Workday Financials implementation, owning the design and configuration of the Financial Data Model (FDM), Chart of Accounts, and dimensional structures that will serve as the source of truth for financial reporting.</p>
<p><strong>In this role, you will:</strong></p>
<ul>
<li><strong>ERP Core Financials Implementation:</strong> Serve as internal functional lead for Workday Financials implementation, partnering with consultants to drive configuration decisions, validate designs, and ensure business requirements are met</li>
</ul>
<ul>
<li><strong>Financial Data Model (FDM) Design:</strong> Own the design and configuration of Chart of Accounts, Worktags, dimensional hierarchies, and Accounting Books that will serve as the source of truth for all financial reporting, ensuring support for both GAAP and Management reporting requirements</li>
</ul>
<ul>
<li><strong>Prism Analytics Development:</strong> Develop and maintain Prism/Accounting Center solutions from source analysis and ingestion design through build, testing, cutover, and hypercare, including integration with external data sources like BigQuery and Pigment</li>
</ul>
<ul>
<li><strong>Requirements Gathering &amp; Reporting:</strong> Gather business requirements from Finance, Accounting, and FP&amp;A stakeholders, translating them into hands-on development of executive reporting, dashboards, and analytics solutions</li>
</ul>
<ul>
<li><strong>Workshop Participation &amp; Solution Design:</strong> Participate in implementation workshops, challenge requirements, and translate business needs into buildable designs and testable acceptance criteria; manage defects and data quality issues throughout the project lifecycle</li>
</ul>
<ul>
<li><strong>Cross-Functional Collaboration:</strong> Collaborate with Integrations, Security, and Financials configuration teams to align master data, journals, controls, and performance service level agreements; partner with Data Infrastructure and BizTech teams on system integrations</li>
</ul>
<ul>
<li><strong>Cutover &amp; Hypercare Planning:</strong> Prepare cutover plans, data migration strategies, reconciliation frameworks, and hypercare plans; document data lineage, controls, and audit artifacts to support SOX compliance requirements</li>
</ul>
<ul>
<li><strong>Platform Expansion &amp; Adoption:</strong> Work closely with engineering teams and business stakeholders to drive ongoing expansion and adoption of the Workday platform, identifying opportunities for process improvement and automation</li>
</ul>
<p><strong>You may be a good fit if you:</strong></p>
<ul>
<li>Have 8+ years of experience in finance systems, ERP implementation, or business systems analysis roles, with at least 5 years of hands-on Workday Financials experience</li>
</ul>
<ul>
<li>Possess deep expertise in Workday Financial Data Model (FDM), including Chart of Accounts design, Worktags configuration, dimensional hierarchies, and Accounting Books setup</li>
</ul>
<ul>
<li>Have strong experience with Workday Prism Analytics, including data modeling, source integration, calculated fields, and report development</li>
</ul>
<ul>
<li>Are skilled at translating complex business requirements into technical solutions, bridging the gap between finance stakeholders and technical implementation teams</li>
</ul>
<ul>
<li>Have experience with full ERP implementation lifecycles, including requirements gathering, configuration, testing, data migration, cutover planning, and hypercare</li>
</ul>
<ul>
<li>Possess strong understanding of financial accounting processes including General Ledger, multi-entity consolidation, intercompany accounting, and management reporting</li>
</ul>
<ul>
<li>Have excellent stakeholder management and communication skills, with ability to work effectively with finance leadership, accounting teams, and technical partners</li>
</ul>
<ul>
<li>Demonstrate strong analytical and problem-solving skills with attention to detail and commitment to data accuracy and integrity</li>
</ul>
<ul>
<li>Are comfortable working in fast-paced, high-growth environments with evolving requirements and tight timelines</li>
</ul>
<p><strong>Strong candidates may also have:</strong></p>
<ul>
<li>Background in accounting, finance, or CPA certification with understanding of GAAP/IFRS reporting requirements</li>
</ul>
<ul>
<li>Experience with Workday Accounting Center for complex journal automation and subledger accounting</li>
</ul>
<ul>
<li>Technical proficiency with SQL, Python, or scripting languages for data analysis and integration support</li>
</ul>
<ul>
<li>Experience integrating Workday with external data platforms such as BigQuery or cloud data warehouses</li>
</ul>
<ul>
<li>Knowledge of SOX compliance requirements and internal controls for financial systems</li>
</ul>
<ul>
<li>Experience with EPM/FP&amp;A systems such as Pigment, Anaplan, or Adaptive Planning and their integration with ERP</li>
</ul>
<ul>
<li>Prior experience at high-growth technology companies scaling toward IPO readiness</li>
</ul>
<ul>
<li>Familiarity with Workday HCM and understanding of HCM-Financials integration points</li>
</ul>
<ul>
<li>Experience with data migration tools, ETL processes, and reconciliation frameworks for ERP implementations</li>
</ul>
<p>The annual compensation range for this role is $list</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>list</Salaryrange>
      <Skills>Workday Financials, Financial Data Model (FDM), Chart of Accounts, Worktags, Dimensional Hierarchies, Accounting Books, Prism Analytics, Data Modeling, Source Integration, Calculated Fields, Report Development, ERP Implementation, Requirements Gathering, Configuration, Testing, Data Migration, Cutover Planning, Hypercare, Financial Accounting, General Ledger, Multi-Entity Consolidation, Intercompany Accounting, Management Reporting, Stakeholder Management, Communication, Analytical Skills, Problem-Solving Skills, Data Accuracy, Integrity, Workday Accounting Center, SQL, Python, Scripting Languages, BigQuery, Cloud Data Warehouses, SOX Compliance, Internal Controls, EPM/FP&amp;A Systems, Pigment, Anaplan, Adaptive Planning, ERP Integration, High-Growth Technology Companies, IPO Readiness, Workday HCM, HCM-Financials Integration, Data Migration Tools, ETL Processes, Reconciliation Frameworks</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a quickly growing organisation with a mission to create reliable, interpretable, and steerable AI systems. The company is working towards public company readiness.</Employerdescription>
      <Employerwebsite>https://job-boards.greenhouse.io</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/4991194008</Applyto>
      <Location>San Francisco, CA | Seattle, WA</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>91ae81f0-b2b</externalid>
      <Title>Data Engineer II</Title>
      <Description><![CDATA[<p>As a Data Engineer, you will be involved in the entire development lifecycle, from brainstorming ideas to implementing scalable solutions that unlock data insights. You will collaborate with stakeholders to gather requirements, design data models, and build pipelines that support reporting, analytics, and exploratory analysis.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Design, build, and sustain efficient, scalable and performant Data Engineering Pipelines to ingest, sanitize, transform (ETL/ELT), and deliver high-volume, high-velocity data from diverse sources.</li>
<li>Ensure reliable and consistent processing of versatile workloads of granularity such as Real Time, Near Real Time, Mini-batch, Batch and On-demand.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>Strong Proficiency in writing and analyzing complex SQL, Python or any 4GL.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Python, Data Engineering, Cloud Data Warehouses, Distributed data processing frameworks, Real-time/streaming data technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Data-Engineer-II/212291</Applyto>
      <Location>Hyderabad</Location>
      <Country></Country>
      <Postedate>2026-02-04</Postedate>
    </job>
    <job>
      <externalid>779ffd11-5cf</externalid>
      <Title>Data Engineer II</Title>
      <Description><![CDATA[<p>As a Data Engineer, you will be involved in the entire development lifecycle, from brainstorming ideas to implementing scalable solutions that unlock data insights. You will collaborate with stakeholders to gather requirements, design data models, and build pipelines that support reporting, analytics, and exploratory analysis.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Design, build, and sustain efficient, scalable and performant Data Engineering Pipelines to ingest, sanitize, transform (ETL/ELT), and deliver high-volume, high-velocity data from diverse sources.</li>
<li>Ensure reliable and consistent processing of versatile workloads of granularity such as Real Time, Near Real Time, Mini-batch, Batch and On-demand.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>Strong Proficiency in writing and analyzing complex SQL, Python or any 4GL.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Python, Data Engineering, Cloud Data Warehouses, Distributed data processing frameworks, Real-time/streaming data technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Data-Engineer-II/212287</Applyto>
      <Location>Hyderabad</Location>
      <Country></Country>
      <Postedate>2026-02-04</Postedate>
    </job>
    <job>
      <externalid>d91f2ddd-f1b</externalid>
      <Title>Data Engineer II</Title>
      <Description><![CDATA[<p>As a Data Engineer, you will be involved in the entire development lifecycle, from brainstorming ideas to implementing scalable solutions that unlock data insights. You will collaborate with stakeholders to gather requirements, design data models, and build pipelines that support reporting, analytics, and exploratory analysis.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Design, build, and sustain efficient, scalable and performant Data Engineering Pipelines to ingest, sanitize, transform (ETL/ELT), and deliver high-volume, high-velocity data from diverse sources.</li>
<li>Ensure reliable and consistent processing of versatile workloads of granularity such as Real Time, Near Real Time, Mini-batch, Batch and On-demand.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>Strong Proficiency in writing and analyzing complex SQL, Python or any 4GL.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Python, Data Engineering, Cloud Data Warehouses, Distributed data processing frameworks, Real-time/streaming data technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Data-Engineer-II/212288</Applyto>
      <Location>Hyderabad</Location>
      <Country></Country>
      <Postedate>2026-02-04</Postedate>
    </job>
  </jobs>
</source>