{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/snowflake"},"x-facet":{"type":"skill","slug":"snowflake","display":"Snowflake","count":100},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5c70414d-4e6"},"title":"Full‑Stack data engineer","description":"<p>We are seeking a highly self-sufficient, motivated engineer with strong full-stack data engineering skills to join our team. This is a remote/offshore role that requires autonomy, excellent communication, and the ability to deliver high-quality work with limited supervision while collaborating with a predominantly US-based team.</p>\n<p>You will build reliable, scalable data products and user experiences that power AI/ML modeling, agentic workflows, and reporting,working end-to-end from data ingestion and transformation through to UI. Our Python-based data platform is undergoing a major evolution toward a modern, cloud-native ELT architecture. 
We are standardizing on Snowflake as our central data platform and dbt as our core transformation framework, implementing scalable, maintainable ELT practices that simplify ingestion, modeling, and deployment.</p>\n<p>This role will be pivotal in independently designing and building robust data pipelines and semantic layers that directly power our AI and machine learning initiatives, delivering clean, reliable, and well-modeled data assets to our data science team for feature engineering, model training, and production inference. You will collaborate closely (primarily via remote channels) with data scientists and ML engineers to ensure our data ecosystem is optimized for experimentation speed, model performance, and seamless integration into downstream products and services.</p>\n<p>Key Responsibilities</p>\n<ul>\n<li>Remote collaboration &amp; communication: Operate effectively as an offshore member of a distributed team, proactively communicating status, risks, and blockers across time zones and coordinating overlap with US working hours as needed.</li>\n</ul>\n<ul>\n<li>Full-stack data engineering: Build across the entire stack, including data ingestion/acquisition and transformation, APIs, front-end components, and automated test suites, delivering production-grade solutions with minimal hand-holding.</li>\n</ul>\n<ul>\n<li>Autonomous delivery &amp; ownership: Take end-to-end ownership of features and projects, clarifying requirements, breaking work into milestones, estimating timelines, and delivering high-quality, well-documented solutions.</li>\n</ul>\n<ul>\n<li>Specification and design: Translate short- and long-term business requirements, architectural considerations, and competing timelines into clear, actionable technical specifications and design documents.</li>\n</ul>\n<ul>\n<li>Code quality: Write clean, maintainable, efficient code that adheres to evolving standards and quality processes, including unit tests and isolated integration tests in 
containerized environments.</li>\n</ul>\n<ul>\n<li>Continuous improvement: Contribute to agile practices and provide input on technical strategy, architectural decisions, and process improvements, continuously suggesting better tools, patterns, and automation.</li>\n</ul>\n<p>Required Skills &amp; Experience</p>\n<ul>\n<li>Professional experience: 5+ years in software engineering, with a full-stack background building complex, scalable data-engineering pipelines using data warehouse technology, SQL with dbt, Python, AWS with Terraform, and modern UI technologies.</li>\n</ul>\n<ul>\n<li>Modern data engineering: Strong experience with medallion data architecture patterns using data warehouse technologies (e.g., Snowflake), data transformation tooling (e.g., dbt), BI tooling, and NoSQL data marts (e.g., Elasticsearch/OpenSearch).</li>\n</ul>\n<ul>\n<li>Testing and QA: Solid understanding of unit testing, CI/CD automation, and quality assurance processes for both data pipeline testing and operational data quality tests.</li>\n</ul>\n<ul>\n<li>Remote work &amp; autonomy: Proven track record working in a remote or distributed environment, demonstrating self-motivation, reliable execution, and the ability to make sound technical decisions independently.</li>\n</ul>\n<ul>\n<li>Agile methodology: Working knowledge of Agile development practices and workflows (e.g., sprint planning, stand-ups, retrospectives) in a distributed team setting.</li>\n</ul>\n<ul>\n<li>Education: Bachelor’s or Master’s degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.</li>\n</ul>\n<p>Preferred Skills &amp; Experience</p>\n<ul>\n<li>Machine learning and AI: Hands-on experience with large language models (LLMs) and agentic frameworks/workflows.</li>\n</ul>\n<ul>\n<li>Search and analytics: Familiarity with the ELK stack (Elasticsearch, Logstash, Kibana) for search and analytics solutions.</li>\n</ul>\n<ul>\n<li>Cloud expertise: Experience with 
AWS cloud services; familiarity with SageMaker; and CI/CD tooling such as GitHub Actions or Jenkins.</li>\n</ul>\n<ul>\n<li>Front-end expertise: Experience building user interfaces with Angular or a modern UI stack.</li>\n</ul>\n<ul>\n<li>Financial domain knowledge: Broad understanding of equities, fixed income, derivatives, futures, FX, and other financial instruments.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5c70414d-4e6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"FIC & Risk Technology","sameAs":"https://mlp.eightfold.ai","logo":"https://logos.yubhub.co/mlp.eightfold.ai.png"},"x-apply-url":"https://mlp.eightfold.ai/careers/job/755955321460","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","Snowflake","dbt","AWS","Terraform","modern UI technologies","data warehouse technology","SQL","unit testing","CI/CD automation","quality assurance processes"],"x-skills-preferred":["machine learning","AI","large language models","agentic frameworks","ELK stack","search and analytics solutions","cloud expertise","AWS cloud services","SageMaker","CI/CD tooling","front-end expertise","Angular","financial domain knowledge"],"datePosted":"2026-04-18T22:13:54.584Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bangalore, Karnataka, India"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Snowflake, dbt, AWS, Terraform, modern UI technologies, data warehouse technology, SQL, unit testing, CI/CD automation, quality assurance processes, machine learning, AI, large language models, agentic frameworks, ELK stack, search and analytics solutions, cloud expertise, AWS cloud services, SageMaker, CI/CD tooling, 
front-end expertise, Angular, financial domain knowledge"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8610ea3d-93b"},"title":"Cloud Platform Engineer","description":"<p>The Business Development/Management Technology team at FIC &amp; Risk Technology is building and operating platforms that support recruiting, hiring, and onboarding of investment professionals. We are currently integrating multiple legacy and new systems into a unified, cloud-native platform to standardize processes, workflows, and data models across the organisation.</p>\n<p>This integration will enable seamless collaboration between teams and provide reliable, scalable data for analytics and reporting. We are looking for a Cloud Platform Engineer to design, build, and operate our AWS-based infrastructure and data platforms, using modern DevOps practices, infrastructure as code, and secure, well-engineered services in Python and C#.</p>\n<p>The successful candidate will collaborate with global technology and business teams to design cloud-native solutions that support business development and onboarding workflows. 
They will partner with global stakeholders to understand requirements and translate them into secure, scalable AWS architectures and platform capabilities.</p>\n<p>Key responsibilities include leading the end-to-end delivery of cloud and platform features, including design, implementation (Python/C#), infrastructure as code, testing, and deployment using DevOps practices.</p>\n<p>We are looking for a highly skilled engineer with 6+ years of experience in software or platform engineering, with significant time spent building and operating solutions in cloud environments (AWS preferred).</p>\n<p>The ideal candidate will have strong hands-on programming experience in Python and C#, with a solid understanding of object-oriented design, design patterns, service-oriented/microservices architectures, concurrency, and SOLID principles.</p>\n<p>They will also have proven experience designing and operating AWS-based platforms (e.g., EC2, ECS/EKS, Lambda, S3, RDS, IAM) using infrastructure as code (Terraform, CloudFormation, or CDK).</p>\n<p>In addition, the successful candidate will have practical experience implementing DevOps practices and CI/CD pipelines (e.g., Jenkins, GitHub Actions, Azure DevOps), including automated testing, security scanning, and deployment.</p>\n<p>Experience supporting data science and analytics platforms, including orchestration tools such as Airflow, distributed processing engines such as Spark, and cloud-native data pipelines, is also required.</p>\n<p>A good understanding of SQL and core database concepts is required; familiarity with AWS analytics services (e.g., Glue, EMR, Redshift, Athena) is a plus.</p>\n<p>Awareness of cloud security best practices, including IAM, network security, data encryption, and secure configuration management, is also necessary.</p>\n<p>Strong problem-solving and analytical skills are essential, as is a demonstrated ability to take ownership, deliver in a fast-paced environment, and collaborate effectively with global teams.</p>\n<p>Excellent communication skills, with the ability to work closely with both technical and non-technical stakeholders, are also required.</p>\n<p>Experience estimating, monitoring, and optimizing AWS infrastructure costs, including use of tools such as AWS Cost Explorer, AWS Budgets, and cost-allocation tagging strategies, is desirable.</p>\n<p>Experience designing and operating workloads across multiple cloud environments and on-premises, using centralized policies, governance, and controls to support business-aligned teams, is also beneficial.</p>\n<p>Working knowledge of networking across on-premises and cloud environments, including VPC design, subnets, routing, VPNs/Direct Connect, load balancing, DNS, and network security controls, is necessary.</p>\n<p>Experience with additional big data tools or platforms (e.g., Kafka, Databricks, Snowflake, Flink) is nice to have.</p>\n<p>Familiarity with Capital Markets concepts and operating models is also beneficial.</p>\n<p>The estimated base salary range for this position is $175,000 to $250,000, which is specific to New York and may change in the future.</p>\n<p>Millennium pays a total compensation package which includes a base salary, discretionary performance bonus, and a comprehensive benefits package.</p>\n<p>When finalising an offer, we take into consideration an individual&#39;s experience level and the qualifications they bring to the role to formulate a competitive total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_8610ea3d-93b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"FIC & Risk 
Technology","sameAs":"https://mlp.eightfold.ai","logo":"https://logos.yubhub.co/mlp.eightfold.ai.png"},"x-apply-url":"https://mlp.eightfold.ai/careers/job/755955139979","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$175,000 to $250,000","x-skills-required":["AWS","Python","C#","DevOps","Infrastructure as Code","Cloud Security","SQL","Database Concepts","Networking"],"x-skills-preferred":["Airflow","Spark","Kafka","Databricks","Snowflake","Flink","Capital Markets"],"datePosted":"2026-04-18T22:12:50.548Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New York, United States of America"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"AWS, Python, C#, DevOps, Infrastructure as Code, Cloud Security, SQL, Database Concepts, Networking, Airflow, Spark, Kafka, Databricks, Snowflake, Flink, Capital Markets","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":175000,"maxValue":250000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6365e7d7-511"},"title":"Senior Forward Deployed Data Scientist/Engineer","description":"<p>We&#39;re hiring a Senior Forward Deployed Data Scientist / Engineer to work directly with customers on ambiguous, high-impact problems at the intersection of data science, product development, and AI deployment.</p>\n<p>This is not a traditional analytics role. On this team, data scientists do the core statistical and modeling work, but they also build real tools and products: evaluation explorers, operator workflows, decision-support systems, experimentation surfaces, and customer-specific AI/data applications that get used in production.</p>\n<p>The right candidate is strong in first-principles problem solving, rigorous measurement, and technical execution. 
They know how to define metrics, design experiments, diagnose failures, and build systems that people actually use. They are also comfortable using modern AI-assisted development tools to prototype and iterate quickly without sacrificing reliability, observability, or judgment. Python and SQL matter in this role, but as execution fluency in service of building better products and making better decisions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Partner directly with enterprise customers to understand workflows, operational pain points, constraints, and success criteria</li>\n<li>Turn ambiguous business and product problems into measurable solutions with clear metrics, technical designs, and deployment plans</li>\n<li>Design and build internal and customer-facing data products, including evaluation tools, workflow applications, decision-support systems, and thin product layers on top of data/ML systems</li>\n<li>Build end-to-end solutions across data ingestion, transformation, experimentation, statistical modeling, deployment, monitoring, and iteration</li>\n<li>Design evaluation frameworks, benchmarks, and feedback loops for ML/LLM systems, human-in-the-loop workflows, and model-assisted operations</li>\n<li>Apply rigorous statistical thinking to experimentation, causal inference, metric design, forecasting, segmentation, diagnostics, and performance measurement</li>\n<li>Use AI-assisted development workflows to accelerate prototyping and product iteration, while maintaining strong engineering discipline</li>\n<li>Diagnose failure modes across data quality, model behavior, retrieval, workflow design, and user experience, and drive fixes into production</li>\n<li>Act as the voice of the customer to Product, Engineering, and Data Science, using field learnings to shape roadmap and platform capabilities</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5+ years of experience in data science, machine learning, quantitative engineering, or another highly analytical technical role</li>\n<li>Proven track record of shipping data, ML, or AI systems that delivered measurable business or product impact</li>\n<li>Exceptional ability to structure ambiguous problems, define the right success metrics, and translate them into executable technical plans</li>\n<li>Strong foundation in statistics, experimentation, causal reasoning, and measurement</li>\n<li>Experience building tools or products, not just analyses; for example, internal workflow tools, evaluation systems, operator-facing products, experimentation platforms, or customer-specific applications</li>\n<li>Hands-on fluency in Python, SQL, and modern data/AI tooling; able to inspect data, prototype quickly, debug deeply, and productionize solutions that work</li>\n<li>Comfort using AI-assisted coding and development workflows to move from idea to usable product quickly</li>\n<li>Strong communication and stakeholder management skills; able to work effectively with customers, engineers, product teams, and executives</li>\n<li>High ownership and bias toward shipping in fast-moving environments with incomplete information</li>\n</ul>\n<p>Preferred qualifications:</p>\n<ul>\n<li>Experience in a forward deployed, solutions, consulting, or other client-facing technical role</li>\n<li>Experience designing evaluation frameworks for LLMs, retrieval systems, agentic workflows, or other AI-enabled products</li>\n<li>Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow</li>\n<li>Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery</li>\n<li>Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems</li>\n<li>Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling</li>\n<li>Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow redesign</li>\n</ul>\n<p>What success looks like: Success in this role means taking a messy, high-stakes customer problem and turning it into a deployed system that is actually used. Sometimes that system is a model. Sometimes it is an evaluation framework. 
Sometimes it is an operator-facing tool or a lightweight data product that changes how decisions get made. In all cases, success is defined by measurable impact, rigorous evaluation, and reliable execution.</p>\n<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training. Scale employees in eligible roles are also granted equity-based compensation, subject to Board of Directors approval. Your recruiter can share more about the specific salary range for your preferred location during the hiring process, and confirm whether the hired role will be eligible for an equity grant. You’ll also receive benefits including, but not limited to: Comprehensive health, dental and vision coverage, retirement benefits, a learning and development stipend, and generous PTO. 
Additionally, this role may be eligible for additional benefits such as a commuter stipend.</p>\n<p>Salary Range: $167,200-$209,000 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6365e7d7-511","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale AI","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4636227005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$167,200-$209,000 USD","x-skills-required":["Python","SQL","Modern data/AI tooling","Statistics","Experimentation","Causal reasoning","Measurement","Data science","Machine learning","Quantitative engineering"],"x-skills-preferred":["Experience in a forward deployed, solutions, consulting, or other client-facing technical role","Experience designing evaluation frameworks for LLMs, retrieval systems, agentic workflows, or other AI-enabled products","Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow","Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery","Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems","Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling","Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow redesign"],"datePosted":"2026-04-18T15:59:44.618Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA; New York, NY"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, SQL, 
Modern data/AI tooling, Statistics, Experimentation, Causal reasoning, Measurement, Data science, Machine learning, Quantitative engineering, Experience in a forward deployed, solutions, consulting, or other client-facing technical role, Experience designing evaluation frameworks for LLMs, retrieval systems, agentic workflows, or other AI-enabled products, Experience with large-scale data processing and distributed systems such as Spark, Ray, or Airflow, Experience with cloud infrastructure and modern data platforms such as AWS, GCP, Snowflake, or BigQuery, Experience building lightweight applications, APIs, internal tools, or workflow software on top of data/ML systems, Familiarity with marketplace experimentation, causal inference, forecasting, optimization, or advanced statistical modeling, Strong product instinct and the judgment to know when the right answer is a model, an experiment, a tool, or a workflow redesign","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":167200,"maxValue":209000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b68ff4cc-e74"},"title":"Data Engineer, Safeguards","description":"<p><strong>About the role</strong></p>\n<p>Anthropic is looking for a Data Engineer to join the Safeguards team and build the data foundations that keep our AI systems safe. The Safeguards team works to monitor models, prevent misuse, and ensure user well-being.</p>\n<p>You&#39;ll design and build the data pipelines, warehousing solutions, and analytical tooling that power our safety and trust efforts at scale. 
You&#39;ll work closely with engineers, data scientists, and policy teams to ensure the Safeguards organization has the data it needs to detect abuse patterns, measure the effectiveness of safety interventions, and make informed decisions about model behavior and enforcement.</p>\n<p>This is a high-impact role where your work will directly support Anthropic&#39;s mission to develop AI that is safe and beneficial.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design, build, and maintain scalable data pipelines that support safety monitoring, abuse detection, and enforcement workflows</li>\n<li>Develop and optimize data models and warehousing solutions to enable efficient analysis of large-scale usage and safety data</li>\n<li>Build and maintain dashboards and reporting infrastructure that give Safeguards teams visibility into model behavior, misuse patterns, and enforcement outcomes</li>\n<li>Collaborate with engineers to integrate data from multiple sources, including model outputs, user reports, and automated classifiers, into a unified analytical layer</li>\n<li>Implement data quality frameworks, monitoring, and alerting to ensure the reliability of safety-critical data</li>\n<li>Partner with research teams to surface data insights that inform model improvements and safety interventions</li>\n<li>Develop self-service data tooling that enables stakeholders to explore safety data and generate reports independently</li>\n<li>Contribute to data governance practices, including access controls, retention policies, and privacy-compliant data handling</li>\n</ul>\n<p><strong>You may be a good fit if you:</strong></p>\n<ul>\n<li>Have 3+ years of experience in data engineering, analytics engineering, or a related role</li>\n<li>Are proficient in SQL and Python, with experience building and maintaining ETL/ELT pipelines</li>\n<li>Have hands-on experience with modern data stack tools such as dbt, Airflow, Spark, or similar orchestration and transformation 
frameworks</li>\n<li>Have worked with cloud data platforms (BigQuery, Redshift, Snowflake, or similar)</li>\n<li>Are comfortable building dashboards and data visualizations using tools like Looker, Tableau, or Metabase</li>\n<li>Communicate clearly and can translate complex data concepts for both technical and non-technical audiences</li>\n<li>Are results-oriented, flexible, and willing to pick up slack even when it falls outside your job description</li>\n<li>Care about the societal impacts of AI and are motivated by safety work</li>\n</ul>\n<p><strong>Strong candidates may have:</strong></p>\n<ul>\n<li>Experience with trust &amp; safety, integrity, fraud, or abuse detection data systems</li>\n<li>Experience with large-scale event streaming systems (Kafka, Pub/Sub, Kinesis)</li>\n<li>Built data infrastructure that supports ML model monitoring or evaluation</li>\n<li>A background in statistical analysis, or experience collaborating closely with data scientists</li>\n<li>Developed internal tooling or self-service analytics platforms</li>\n</ul>\n<p><strong>Strong candidates need not have:</strong></p>\n<ul>\n<li>A formal degree in Computer Science or a related field; we value practical experience and demonstrated ability over credentials</li>\n<li>Prior experience in AI or machine learning; you&#39;ll learn the domain-specific context on the job</li>\n<li>Previous experience at an AI safety or research organization</li>\n<li>Deep expertise across every tool listed above; familiarity with a subset and a willingness to learn is enough</li>\n</ul>\n<p><strong>Logistics</strong></p>\n<ul>\n<li>Minimum education: Bachelor’s degree or an equivalent combination of education, training, and/or experience</li>\n<li>Required field of study: A field relevant to the role as demonstrated through coursework, training, or professional experience</li>\n<li>Minimum years of experience: Years of experience required will correlate with the internal job level requirements for the position</li>\n<li>Location-based hybrid policy: Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</li>\n<li>Visa sponsorship: We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</li>\n</ul>\n<p><strong>How we&#39;re different</strong></p>\n<p>We believe that the highest-impact AI research will be big science. At Anthropic we work as a single cohesive team on just a few large-scale research efforts. And we value impact, advancing our long-term goals of steerable, trustworthy AI, rather than working on smaller and more specific puzzles. We view AI research as an empirical science, which has as much in common with physics and biology as with traditional efforts in computer science. We&#39;re an extremely collaborative group, and we host frequent research discussions to ensure that we are pursuing the highest-impact work at any given time. As such, we greatly value communication skills. The easiest way to understand our research directions is to read our recent research. This research continues many of the directions our team worked on prior to Anthropic, including: GPT-3, Circuit-Based Interpretability, Multimodal Neurons, Scaling Laws, AI &amp; Compute, Concrete Problems in AI Safety, and Learning from Human Preferences.</p>\n<p><strong>Come work with us!</strong></p>\n<p>Anthropic is a public benefit corporation headquartered in San Francisco. 
We offer competitive compensation and benefits, optional equity donation matching, generous vacation and parental leave, flexible working hours, and a lovely office space in which to collaborate with colleagues.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b68ff4cc-e74","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.com/","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5156057008","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"£170,000-£220,000 GBP","x-skills-required":["SQL","Python","ETL/ELT pipelines","dbt","Airflow","Spark","cloud data platforms","BigQuery","Redshift","Snowflake","Looker","Tableau","Metabase"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:59:33.960Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, UK"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, ETL/ELT pipelines, dbt, Airflow, Spark, cloud data platforms, BigQuery, Redshift, Snowflake, Looker, Tableau, Metabase","baseSalary":{"@type":"MonetaryAmount","currency":"GBP","value":{"@type":"QuantitativeValue","minValue":170000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_28d86251-85e"},"title":"Senior Data Analyst, Marketing Analytics","description":"<p>As a Senior Data Analyst, Marketing Analytics at GitLab, you&#39;ll serve as a strategic analytics partner to Marketing leadership and help shape how we measure, model, and improve marketing performance.</p>\n<p>Reporting to the Senior Manager of Marketing Analytics, you&#39;ll take ownership of high-impact 
analytical work across attribution, inbound funnel analysis, target setting, and campaign measurement.</p>\n<p>You&#39;ll work across the full data lifecycle, from building and maintaining dbt models in Snowflake that power business intelligence, to delivering executive-ready insights, to partnering cross-functionally to support important business decisions.</p>\n<p>In this role, you&#39;ll use a modern analytics and AI-enabled workflow that includes Snowflake, Claude with MCP connections, and GitLab Duo Agent Platform in git-based team workflows.</p>\n<p>This is a high-visibility role for someone with strong SQL skills, sound judgment in B2B software marketing analytics, and the ability to turn complex data into clear stories for senior stakeholders in our all-remote, values-driven environment.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Serve as a trusted advisor to senior Marketing and Developer Relations leaders, providing data-driven insights that inform marketing strategy, product development, and go-to-market decisions.</li>\n</ul>\n<ul>\n<li>Establish and scale proactive insights across the Marketing function, including experimentation and connecting marketing credit from initial touchpoints to annual recurring revenue.</li>\n</ul>\n<ul>\n<li>Measure the impact of community programs, developer advocacy, and evangelism efforts on awareness, adoption, and pipeline.</li>\n</ul>\n<ul>\n<li>Evaluate the effectiveness of demand generation programs, including trial conversion analysis across segments, lead scoring optimization, and outreach platform performance.</li>\n</ul>\n<ul>\n<li>Partner closely with Analytics Engineering and Data Platform teams to translate business questions into clear technical requirements and support reliable delivery in the analytics stack.</li>\n</ul>\n<ul>\n<li>Help evolve GitLab&#39;s multi-touch attribution approach, including attributed metric definitions and implementation across 
reporting.</li>\n</ul>\n<ul>\n<li>Contribute to annual planning cycles through partnership with Sales, Finance, and Product leadership.</li>\n</ul>\n<ul>\n<li>Use AI-enabled analytics workflows through GitLab Duo Agent Platform, Claude with MCP integrations, and Glean within GitLab to improve how insights are developed and shared.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>5+ years of experience in a data analyst or analytics engineer role, preferably in B2B software marketing analytics or in a large-scale analytics consultancy supporting similar companies.</li>\n</ul>\n<ul>\n<li>Strong proficiency in SQL, including the ability to connect disparate data sources and build a clear view of marketing program performance in a cloud data warehouse, ideally Snowflake.</li>\n</ul>\n<ul>\n<li>Experience working with senior leaders and tailoring data-driven narratives to support decision-making.</li>\n</ul>\n<ul>\n<li>Hands-on experience with dbt for data transformation, testing, and documentation.</li>\n</ul>\n<ul>\n<li>Strong understanding of B2B marketing funnel metrics, including marketing qualified leads, sales accepted opportunities, pipeline, multi-touch attribution, and conversion rate analysis.</li>\n</ul>\n<ul>\n<li>Experience working with marketing automation platforms such as Marketo and customer relationship management systems such as Salesforce, including data integration patterns and sync logic.</li>\n</ul>\n<ul>\n<li>Proficiency with a business intelligence and visualization tool, with Tableau preferred.</li>\n</ul>\n<ul>\n<li>Excellent written and verbal communication skills, with the ability to distill complex analysis into clear narratives for technical and non-technical audiences.</li>\n</ul>\n<ul>\n<li>Experience with AI-assisted analytics development, including MCP connections and agentic workflows.</li>\n</ul>\n<ul>\n<li>Familiarity with Git workflows for version control, continuous integration and continuous delivery, issue tracking, 
and merge request workflows.</li>\n</ul>\n<ul>\n<li>Comfort working in a remote, async-first, globally distributed environment.</li>\n</ul>\n<p><strong>About the Team</strong></p>\n<p>The Marketing Analytics team sits within GitLab&#39;s Enterprise Data organization and partners closely with the broader Marketing team as well as other departments across GitLab.</p>\n<p>We build and maintain trusted, scalable data products that help inform strategic decisions for Marketing while also creating alignment with related business metrics, especially across Product and Sales.</p>\n<p>You&#39;ll join a fully remote team that collaborates asynchronously across time zones using our shared standards, code review, and documentation to support consistency and quality.</p>\n<p>We work on analytically complex, high-impact problems where strong data foundations, clear definitions, and thoughtful cross-functional partnerships are essential to helping stakeholders understand performance and make better decisions.</p>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Benefits to support your health, finances, and well-being</li>\n</ul>\n<ul>\n<li>Flexible Paid Time Off</li>\n</ul>\n<ul>\n<li>Team Member Resource Groups</li>\n</ul>\n<ul>\n<li>Equity Compensation &amp; Employee Stock Purchase Plan</li>\n</ul>\n<ul>\n<li>Growth and Development Fund</li>\n</ul>\n<ul>\n<li>Parental leave</li>\n</ul>\n<ul>\n<li>Home office support</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_28d86251-85e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"GitLab","sameAs":"https://about.gitlab.com/","logo":"https://logos.yubhub.co/about.gitlab.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/gitlab/jobs/8472178002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$94,100-$201,600 
USD","x-skills-required":["SQL","dbt","Snowflake","Tableau","Marketo","Salesforce","AI-assisted analytics development","MCP connections","agentic workflows","Git workflows","version control","continuous integration and continuous delivery","issue tracking","merge request workflows"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:58:57.489Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, North America"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, dbt, Snowflake, Tableau, Marketo, Salesforce, AI-assisted analytics development, MCP connections, agentic workflows, Git workflows, version control, continuous integration and continuous delivery, issue tracking, merge request workflows","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":94100,"maxValue":201600,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_477935cf-ac5"},"title":"Senior Strategic Partner Manager, Solutions","description":"<p>We&#39;re looking for a Senior Strategic Partner Manager, Solutions to join our team. 
As a key member of our Partner organization, you will play a critical role in building a global Solutions Partner Program that equips ZoomInfo&#39;s top implementation partners to deliver successful project outcomes and ensure ongoing customer adoption.</p>\n<p>Your primary responsibilities will include designing and managing the global Solutions Partner Program and methodology, working with the partner team, delivery and account teams to ensure the proper program elements, resources, and processes are in place to support solutions partners&#39; success, providing full program strategy, project management and timely updates on key solutions partner success initiatives, and being the trusted advisor for ZoomInfo solutions partners.</p>\n<p>You will also work collaboratively with Partners and internal ZoomInfo delivery and technical experts to develop repeatable frameworks built from successful customer deployments, collect qualitative and quantitative data points to measure and report on individual partner performance based upon key metrics (KPIs) to ensure a high standard of implementation quality from our top partners, and play an active role in contributing to the evolution of ZoomInfo’s overall partner program and strategy.</p>\n<p>Additionally, you will architect and manage partner business planning, QBRs, assessments, etc., own and manage partner interactions with ZoomInfo Team (Marketing, Product, Pre-Sales, Sales, Services, Enablement, Customer Success, Partner Operations, and Executive Leadership), handle administrative functions related to Partner Account and ensure internal tools are updated and sales hygiene is maintained, and support Partners and internal stakeholders’ ad-hoc requests and jump in where needed.</p>\n<p>Core systems and tools you may be working with include Salesforce, Jira, Confluence, GSuite, Netsuite, Snowflake, GCP, AWS, plus multiple other peripheral software tools in these ecosystems.</p>\n<p>Requirements include 5-8 years of 
experience working with and handling solutions partners, confirmed track records of sales over-performance, existing SI partner relationships and network, strategic business and marketing planning capabilities, excellent interpersonal skills and a confirmed capacity to build positive relationships and close business with partners, proven ability to work cross-functionally, and self-motivation, strong self-management skills, and leadership qualities.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_477935cf-ac5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"ZoomInfo","sameAs":"https://www.zoominfo.com","logo":"https://logos.yubhub.co/zoominfo.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/zoominfo/jobs/8441280002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$105,350-$165,550 USD","x-skills-required":["Sales","Partner Management","Program Management","Project Management","Strategic Planning","Business Development","Marketing","Product","Pre-Sales","Services","Enablement","Customer Success","Partner Operations","Executive Leadership","Salesforce","Jira","Confluence","GSuite","Netsuite","Snowflake","GCP","AWS"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:58:37.592Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Waltham, Massachusetts, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Sales, Partner Management, Program Management, Project Management, Strategic Planning, Business Development, Marketing, Product, Pre-Sales, Services, Enablement, Customer Success, Partner Operations, Executive Leadership, Salesforce, Jira, Confluence, GSuite, Netsuite, Snowflake, GCP, 
AWS","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":105350,"maxValue":165550,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_81b2e2ee-c36"},"title":"Manager, Solutions Engineering","description":"<p>About Mixpanel</p>\n<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>\n<p>As a Manager on the Solutions Engineering team at Mixpanel, you will lead a talented group of analytics consultants who are pivotal to our success. You will be at the forefront of driving customer value, guiding your team as they serve as the primary technical resources for our Sales organisation.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Develop &amp; Mentor: Lead, coach, and grow a high-performing and inclusive team of Solutions Engineers, actively investing in their career development and upholding a high standard of performance.</li>\n</ul>\n<ul>\n<li>Drive Results: Partner closely with Sales leadership and Account Executives to provide technical expertise that drives new, retained, and expansion ARR. You will ensure your team&#39;s activities are directly contributing to the company&#39;s bottom line.</li>\n</ul>\n<ul>\n<li>Prioritise &amp; Problem Solve: Guide your team through complex customer evaluations and technical challenges. You will manage team resources effectively, aligning the right skills to customer needs to achieve productivity targets and successful outcomes.</li>\n</ul>\n<ul>\n<li>Cross-Functional Partnership: Act as a key technical liaison, collaborating with peer managers across Sales, Product, and Engineering. 
You will gather and synthesise customer feedback from your team to influence product strategy and solve problems at scale.</li>\n</ul>\n<ul>\n<li>Communicate &amp; Manage Change: Effectively translate broader company and departmental strategy into clear, actionable goals for your team. You will guide your direct reports through evolving business priorities with empathy and clarity.</li>\n</ul>\n<ul>\n<li>Hire the Best: Actively assess the needs of the team, build a pipeline of top talent, and hire outstanding individuals who elevate the team&#39;s capabilities and contribute to our inclusive culture.</li>\n</ul>\n<ul>\n<li>Innovate &amp; Raise the Bar: Relentlessly seek to improve how your team operates, from refining demo strategies and proof-of-concept methodologies to adopting new tools and processes that increase effectiveness and celebrate success.</li>\n</ul>\n<p>We&#39;re Looking For Someone Who</p>\n<ul>\n<li>Has progressive experience in a B2B SaaS environment, including 3+ years of people management experience leading a technical pre-sales, solutions engineering, or professional services team.</li>\n</ul>\n<ul>\n<li>Exhibits a &#39;player-coach&#39; mentality with deep knowledge in the data and analytics space. 
You are an expert on how data products (like CDPs, data warehouses, and analytics tools) are implemented and adopted by customers.</li>\n</ul>\n<ul>\n<li>Is a proven cross-functional partner with a track record of successfully working with sales teams to navigate complex deals and drive revenue.</li>\n</ul>\n<ul>\n<li>Demonstrates expertise in communicating complex technical concepts clearly and effectively to both technical and non-technical stakeholders.</li>\n</ul>\n<ul>\n<li>Is skilled at prioritising team activities and managing workload in a dynamic environment, balancing customer needs with efficiency goals.</li>\n</ul>\n<ul>\n<li>Is a natural mentor and developer of talent, with a passion for coaching and a history of building inclusive, high-achieving teams.</li>\n</ul>\n<ul>\n<li>Handles ambiguity with ease, demonstrating flexibility and a proactive, problem-solving mindset when adapting to new challenges and business priorities.</li>\n</ul>\n<ul>\n<li>Actively seeks feedback and is humble to learn, consistently looking for ways to improve themselves and their team.</li>\n</ul>\n<p>Bonus Points</p>\n<ul>\n<li>Previous experience in management consulting, strategic operations, or a similar role focused on go-to-market strategy.</li>\n</ul>\n<ul>\n<li>Direct, hands-on experience with Mixpanel or other product analytics tools like Amplitude, Pendo, or Contentsquare.</li>\n</ul>\n<ul>\n<li>Strong familiarity with the modern data stack, including tools like Snowflake, Google BigQuery, Segment, or Hightouch.</li>\n</ul>\n<p>#LI-Hybrid</p>\n<p>Compensation</p>\n<p>The amount listed below is the total target cash compensation (TTCC) and includes base compensation and variable compensation in the form of either a company bonus or commissions. Variable compensation type is determined by your role and level. 
In addition to the cash compensation provided, this position is also eligible for equity consideration and other benefits including medical, vision, and dental insurance coverage.</p>\n<p>Our salary ranges are determined by role and level and are benchmarked to the SF Bay Area Technology data cut released by Radford, a global compensation database. The range displayed represents the minimum and maximum TTCC for new hire salaries for the position across all of our US locations. To stay on top of market conditions, we refresh our salary ranges twice a year so these ranges may change in the future. Within the range, individual pay is determined by experience, job-related skills, qualifications, and other factors.</p>\n<p>If you have questions about the specific range, your recruiter can share this information.</p>\n<p>Mixpanel Compensation Range $238,300-$321,705 USD</p>\n<p>Benefits and Perks</p>\n<ul>\n<li>Comprehensive Medical, Vision, and Dental Care</li>\n</ul>\n<ul>\n<li>Mental Wellness Benefit</li>\n</ul>\n<ul>\n<li>Generous Vacation Policy &amp; Additional Company Holidays</li>\n</ul>\n<ul>\n<li>Enhanced Parental Leave</li>\n</ul>\n<ul>\n<li>Volunteer Time Off</li>\n</ul>\n<ul>\n<li>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</li>\n</ul>\n<p>Culture Values</p>\n<ul>\n<li>Make Bold Bets: We choose courageous action over comfortable progress.</li>\n</ul>\n<ul>\n<li>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</li>\n</ul>\n<ul>\n<li>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</li>\n</ul>\n<ul>\n<li>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</li>\n</ul>\n<ul>\n<li>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north 
star.</li>\n</ul>\n<ul>\n<li>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</li>\n</ul>\n<p>Why choose Mixpanel?</p>\n<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors like Andreessen Horowitz, Sequoia, YC, and, most recently, Bain Capital. Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviours and easily track overarching company success metrics.</p>\n<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service. Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>\n<p>Mixpanel is an equal opportunity employer supporting workforce diversity. At Mixpanel, we are focused on the things that really matter: our people, our customers, and our partners, out of a recognition that those relationships are the most important thing.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_81b2e2ee-c36","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mixpanel","sameAs":"https://mixpanel.com","logo":"https://logos.yubhub.co/mixpanel.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mixpanel/jobs/7513876","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$238,300-$321,705 USD","x-skills-required":["data and analytics space","data products","CDPs","data warehouses","analytics tools","complex technical concepts","team activities","workload management","customer needs","efficiency goals","ambiguity","flexibility","problem-solving mindset","product analytics tools","Amplitude","Pendo","Contentsquare","modern data 
stack","Snowflake","Google BigQuery","Segment","Hightouch"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:58:04.835Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, US (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data and analytics space, data products, CDPs, data warehouses, analytics tools, complex technical concepts, team activities, workload management, customer needs, efficiency goals, ambiguity, flexibility, problem-solving mindset, product analytics tools, Amplitude, Pendo, Contentsquare, modern data stack, Snowflake, Google BigQuery, Segment, Hightouch","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":238300,"maxValue":321705,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_653fb770-5c8"},"title":"Senior Revenue Accountant","description":"<p>Job Title: Senior Revenue Accountant</p>\n<p>Location: San Francisco, California</p>\n<p>Department: Finance &amp; Business Operations</p>\n<p>Job Description:</p>\n<p>Intercom is the AI Customer Service company on a mission to help businesses provide incredible customer experiences.</p>\n<p>Our AI agent Fin, the most advanced customer service AI agent on the market, lets businesses deliver always-on, impeccable customer service and ultimately transform their customer experiences for the better. Fin can also be combined with our Helpdesk to become a complete solution called the Intercom Customer Service Suite, which provides AI enhanced support for the more complex or high touch queries that require a human agent.</p>\n<p>Founded in 2011 and trusted by nearly 30,000 global businesses, Intercom is setting the new standard for customer service. 
Driven by our core values, we push boundaries, build with speed and intensity, and consistently deliver incredible value to our customers.</p>\n<p>What&#39;s the opportunity?</p>\n<p>This position will play a key role in supporting and scaling Intercom’s revenue accounting operations. You will work closely with Revenue, Finance Systems, Data, and GTM teams to ensure accurate, efficient, and scalable revenue processes as the business grows.</p>\n<p>This role is highly operational and detail-oriented, with a focus on owning complex revenue workflows, improving visibility into in-month activity, and ensuring the accuracy of automated processes. You’ll play a critical role in validating automation outputs, strengthening controls, and reducing risk across key revenue streams, including enterprise contracts and usage-based revenue.</p>\n<p>We are seeking a motivated and detail-oriented individual with strong revenue accounting fundamentals who is excited to improve processes, leverage automation and AI tools, and operate in a fast-paced SaaS environment.</p>\n<p>What will I be doing?</p>\n<ul>\n<li>Own and execute key components of the monthly revenue close process, ensuring accuracy and timeliness.</li>\n</ul>\n<ul>\n<li>Manage complex revenue workflows, including ramped contracts, usage-based revenue, and other non-standard arrangements.</li>\n</ul>\n<ul>\n<li>Track and monitor revenue transactions throughout the month to improve visibility and reduce month-end close spikes.</li>\n</ul>\n<ul>\n<li>Own and prepare reconciliations for revenue-related accounts, ensuring completeness and accuracy.</li>\n</ul>\n<ul>\n<li>Perform detailed variance and flux analysis, identifying key drivers and investigating discrepancies.</li>\n</ul>\n<ul>\n<li>Serve as UAT lead for automation and systems initiatives impacting revenue accounting.</li>\n</ul>\n<ul>\n<li>Validate system outputs and ensure revenue recognition is accurate and compliant with ASC 606.</li>\n</ul>\n<ul>\n<li>Build 
and maintain controls around automated processes to ensure data integrity and audit readiness.</li>\n</ul>\n<ul>\n<li>Partner closely with Finance Systems and Data teams to enhance revenue systems, data flows, and reporting.</li>\n</ul>\n<ul>\n<li>Support accounting for enterprise contracts and ensure proper treatment of complex deal structures.</li>\n</ul>\n<ul>\n<li>Assist in operationalizing usage-based revenue models, including validation of data inputs and revenue outputs.</li>\n</ul>\n<ul>\n<li>Contribute to process improvements and automation initiatives, including leveraging AI tools to streamline workflows and reduce manual effort.</li>\n</ul>\n<ul>\n<li>Support audit requests and maintain documentation for key revenue processes and controls.</li>\n</ul>\n<p>What skills do I need?</p>\n<ul>\n<li>Bachelor’s degree in Accounting or Finance; CPA (or in progress) preferred.</li>\n</ul>\n<ul>\n<li>5+ years of accounting experience, with exposure to revenue accounting (SaaS experience preferred).</li>\n</ul>\n<ul>\n<li>Strong understanding of U.S. 
GAAP and foundational knowledge of ASC 606.</li>\n</ul>\n<ul>\n<li>Experience supporting month-end close processes, including reconciliations and variance analysis.</li>\n</ul>\n<ul>\n<li>Experience with NetSuite (NetSuite ARM experience is a plus).</li>\n</ul>\n<ul>\n<li>Familiarity with Stripe, Salesforce (SFDC), and Snowflake is a plus.</li>\n</ul>\n<ul>\n<li>Experience with SaaS and/or usage-based revenue models preferred.</li>\n</ul>\n<ul>\n<li>Strong attention to detail and ability to manage and track high volumes of transactions.</li>\n</ul>\n<ul>\n<li>Analytical mindset with the ability to investigate issues and ensure data accuracy.</li>\n</ul>\n<ul>\n<li>Interest in automation and experience working with or validating system-driven processes.</li>\n</ul>\n<ul>\n<li>Excitement about leveraging AI tools to improve accounting workflows and efficiency.</li>\n</ul>\n<ul>\n<li>Strong communication and collaboration skills, with the ability to work cross-functionally.</li>\n</ul>\n<ul>\n<li>Organized, proactive, and able to operate effectively in a fast-paced, high-growth environment.</li>\n</ul>\n<p>Benefits</p>\n<p>We are a well-treated bunch with awesome benefits! 
If there’s something important to you that’s not on this list, talk to us!</p>\n<ul>\n<li>Competitive salary and meaningful equity</li>\n</ul>\n<ul>\n<li>Comprehensive medical, dental, and vision coverage</li>\n</ul>\n<ul>\n<li>Regular compensation reviews - great work is rewarded!</li>\n</ul>\n<ul>\n<li>Unlimited access to Claude Code and best-in-class AI tools; experimentation &amp; building is encouraged &amp; celebrated.</li>\n</ul>\n<ul>\n<li>Flexible paid time off policy</li>\n</ul>\n<ul>\n<li>Paid Parental Leave Program</li>\n</ul>\n<ul>\n<li>401k plan &amp; match</li>\n</ul>\n<ul>\n<li>In-office bicycle storage</li>\n</ul>\n<ul>\n<li>Fun events for Intercomrades, friends, and family!</li>\n</ul>\n<p>*Proof of eligibility to work in the United States is required.</p>\n<p>The base salary range for candidates within the San Francisco Bay Area is $147,000-$180,000. Actual base pay will depend on a variety of factors such as education, skills, experience, location, etc. The base pay range is subject to change and may be modified in the future. All regular employees may also be eligible for the corporate bonus program or a sales incentive (target included in OTE) as well as stock in the form of Restricted Stock Units (RSUs).</p>\n<p>#LI-Hybrid</p>\n<p>Policies</p>\n<p>Intercom has a hybrid working policy. We believe that working in person helps us stay connected, collaborate more easily, and create a great culture while still providing flexibility to work from home. We expect employees to be in the office at least three days per week.</p>\n<p>We have a radically open and accepting culture at Intercom. We avoid spending time on divisive subjects to foster a safe and cohesive work environment for everyone. As an organization, our policy is to not advocate on behalf of the company or our employees on any social or political topics in our internal or external communications. 
We respect personal opinion and expression on these topics on personal social platforms on personal time, and do not challenge or confront anyone for their views on non-work related topics. Our goal is to focus on doing incredible work to achieve our goals and unite the company through our core values.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_653fb770-5c8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Intercom","sameAs":"https://www.intercom.com/","logo":"https://logos.yubhub.co/intercom.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/intercom/jobs/7774299","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$147,000-$180,000","x-skills-required":["Accounting","Revenue Accounting","U.S. GAAP","ASC 606","NetSuite","Stripe","Salesforce","Snowflake","SaaS","Usage-Based Revenue Models"],"x-skills-preferred":["Automation","AI Tools","Process Improvement","Data Analysis","Communication","Collaboration"],"datePosted":"2026-04-18T15:57:54.596Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"Accounting, Revenue Accounting, U.S. 
GAAP, ASC 606, NetSuite, Stripe, Salesforce, Snowflake, SaaS, Usage-Based Revenue Models, Automation, AI Tools, Process Improvement, Data Analysis, Communication, Collaboration","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":147000,"maxValue":180000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_dafcc67c-e10"},"title":"Senior Accountant","description":"<p>Ready to be pushed beyond what you think you’re capable of?</p>\n<p>At Coinbase, our mission is to increase economic freedom in the world.</p>\n<p>We&#39;re seeking a Senior Accountant to join our UK Finance team.</p>\n<p>As a Senior Accountant, you&#39;ll own daily finance operations and have input and review to sharpen forecasts, model scenarios, and turn raw data into crisp, board-ready insights.</p>\n<p>You&#39;ll play a pivotal role in scaling Coinbase’s presence in the UK and wider EMEA region through new licence applications, new product launches, and governance.</p>\n<p>This is a high-impact position designed for a versatile finance professional who can navigate the complexities of a regulated fintech environment while driving long-term efficiency through automation.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Ensure compliance with internal controls, policies, and procedures over platform accounting activities</li>\n</ul>\n<ul>\n<li>Collaborate with cross-functional partners to operationalize new areas of the business to facilitate streamlined accounting transactions as assigned</li>\n</ul>\n<ul>\n<li>Analyze data against business drivers to identify trends / issues</li>\n</ul>\n<ul>\n<li>Provide support for internal and external audits related to specific areas of responsibility</li>\n</ul>\n<ul>\n<li>Maintain up-to-date process documentation and procedures as assigned</li>\n</ul>\n<ul>\n<li>Contribute to automation and efficiency initiatives to streamline 
reconciliations</li>\n</ul>\n<ul>\n<li>Assist in the preparation and/or review of month-end and quarter-end close, own UK IFRS books of accounts and reconciliations, maintain major accounting policies, and adjust financial statements as required.</li>\n</ul>\n<ul>\n<li>Oversee the monthly US GAAP to local IFRS accounting adjustments.</li>\n</ul>\n<ul>\n<li>Support regulatory reporting requirements.</li>\n</ul>\n<ul>\n<li>Monitor and oversee outsourced finance services, ensuring performance meets agreed SLAs/KPIs.</li>\n</ul>\n<ul>\n<li>Support the Global finance function as part of the month-end close cycle.</li>\n</ul>\n<ul>\n<li>Support ad hoc financial information requests from the UK Head of Finance and other stakeholders as required.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5+ years working in public accounting or in the fintech/financial services industries preferred.</li>\n</ul>\n<ul>\n<li>Experience working in a regulated environment.</li>\n</ul>\n<ul>\n<li>BA/BS in Accounting or related field, or ACCA/ACA preferred.</li>\n</ul>\n<ul>\n<li>Basic knowledge/understanding of payment systems and cash reconciliation processes at a large financial institution.</li>\n</ul>\n<ul>\n<li>Basic knowledge of blockchain technology and the crypto economy.</li>\n</ul>\n<ul>\n<li>Possess excellent analytical, business partnering, problem solving, and prioritization skills.</li>\n</ul>\n<ul>\n<li>Able to work well in a dynamic environment and to recommend and implement process improvements.</li>\n</ul>\n<ul>\n<li>Excellent communication skills, both written and verbal.</li>\n</ul>\n<ul>\n<li>Strong work ethic and team player.</li>\n</ul>\n<p>AI Requirements:</p>\n<ul>\n<li>Demonstrates the ability to responsibly use generative AI tools and copilots (e.g., LibreChat, Gemini, Glean) in daily workflows, continuously learn as tools evolve, and apply human-in-the-loop practices to deliver business-ready outputs and drive measurable improvements in efficiency, 
cost, and quality.</li>\n</ul>\n<p>Nice to haves:</p>\n<ul>\n<li>Experience working in Netsuite, FloQast, and G-Suite.</li>\n</ul>\n<ul>\n<li>Basic knowledge of SQL and experience working with Snowflake, Looker, or similar analytics tools.</li>\n</ul>\n<ul>\n<li>Basic knowledge of Client Asset Safeguarding.</li>\n</ul>\n<p>Pay Transparency Notice: The target annual base salary for this position can range as detailed below. Total compensation may also include equity and bonus eligibility and benefits (including medical, dental, and vision).</p>\n<p>Annual base salary range (excluding equity and bonus):</p>\n<p>£76,500-£85,000 GBP</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_dafcc67c-e10","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Coinbase","sameAs":"https://www.coinbase.com/","logo":"https://logos.yubhub.co/coinbase.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coinbase/jobs/7554014","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"£76,500-£85,000 GBP","x-skills-required":["public accounting","fintech/financial services","regulated environment","accounting","blockchain technology","crypto economy","analytical skills","business partnering","problem solving","prioritization skills","communication skills","strong work ethic","team player"],"x-skills-preferred":["Netsuite","FloQast","G-Suite","SQL","Snowflake","Looker","Client Asset Safeguarding"],"datePosted":"2026-04-18T15:57:47.335Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - UK"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"public accounting, fintech/financial services, regulated environment, accounting, blockchain technology, crypto economy, analytical skills, 
business partnering, problem solving, prioritization skills, communication skills, strong work ethic, team player, Netsuite, FloQast, G-Suite, SQL, Snowflake, Looker, Client Asset Safeguarding","baseSalary":{"@type":"MonetaryAmount","currency":"GBP","value":{"@type":"QuantitativeValue","minValue":76500,"maxValue":85000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_f1950023-ef7"},"title":"Senior Engineering Manager, Activation","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Brex’s AI-native automation and world-class service eliminate manual expense and accounting tasks for customers so they can focus on what matters most. Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream. We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Engineering</p>\n<p>Engineering at Brex is about building systems that scale with speed and intention. Our teams span Software, Data, Security, and IT, and operate with high autonomy and deep collaboration. We tackle hard technical problems, own our outcomes, and push for excellence at every level, from architecture to deployment. 
It’s an environment where engineering is a craft, and builders become leaders.</p>\n<p>What you’ll do</p>\n<p>You will lead an engineering group focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, integrations, and implementation workflows that help customers realize value quickly. This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced customer journeys.</p>\n<p>The ideal candidate is a seasoned engineering leader with experience scaling user-facing onboarding systems, delivering high-quality product experiences, and partnering deeply across Product, Design, Operations, and GTM teams.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our New York office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday. 
As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities</p>\n<ul>\n<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding and implementation experience.</li>\n</ul>\n<ul>\n<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>\n</ul>\n<ul>\n<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>\n</ul>\n<ul>\n<li>Drive execution of the Activation roadmap, ensuring timely, high-quality delivery of systems and features that help customers activate and realize value.</li>\n</ul>\n<ul>\n<li>Lead and manage multiple teams of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>\n</ul>\n<ul>\n<li>Build systems that integrate identity verification, KYC and compliance workflows, customer data ingestion, and implementation tooling in a scalable and reliable manner.</li>\n</ul>\n<ul>\n<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>\n</ul>\n<ul>\n<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>\n</ul>\n<ul>\n<li>Strong technical background and understanding of software development principles.</li>\n</ul>\n<ul>\n<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>\n</ul>\n<ul>\n<li>Demonstrated track record of shipping customer-facing features across multiple release cycles.</li>\n</ul>\n<ul>\n<li>3+ years of experience managing or leading multiple technical teams in a high-growth 
environment.</li>\n</ul>\n<ul>\n<li>Experience working regularly with cross-functional partners (e.g. Product, Design, Operations, Sales) and driving alignment across stakeholders.</li>\n</ul>\n<ul>\n<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer-facing experiences.</li>\n</ul>\n<ul>\n<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>\n</ul>\n<ul>\n<li>A track record of building AI-powered product experiences, including LLM-driven automation and personalization.</li>\n</ul>\n<p>Bonus points</p>\n<ul>\n<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>\n</ul>\n<ul>\n<li>You have started your own technology venture or were an early technical founder/employee. We value entrepreneurial spirit &amp; scrappiness!</li>\n</ul>\n<ul>\n<li>You are a champion for the customer and constantly put yourself in their shoes to create intuitive, frictionless experiences.</li>\n</ul>\n<p>Compensation</p>\n<p>The expected salary range for this role is $300,000 - $375,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_f1950023-ef7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8330492002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$300,000 - $375,000","x-skills-required":["Technical leadership","Software development principles","Full-stack engineering","Customer-facing features","Data-driven mindset","AI-powered product experiences","LLM-driven automation","Personalization"],"x-skills-preferred":["Data platforms","Snowflake","Hex","Entrepreneurial spirit","Scrappiness","Customer obsession"],"datePosted":"2026-04-18T15:57:39.757Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New York, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Technical leadership, Software development principles, Full-stack engineering, Customer-facing features, Data-driven mindset, AI-powered product experiences, LLM-driven automation, Personalization, Data platforms, Snowflake, Hex, Entrepreneurial spirit, Scrappiness, Customer obsession","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":300000,"maxValue":375000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_477d343e-e37"},"title":"Customer Success Architect","description":"<p>About Mixpanel</p>\n<p>Mixpanel turns data clarity into innovation. 
Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence. Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees.</p>\n<p>About the Customer Success Team:</p>\n<p>Mixpanel’s Customer Success &amp; Solutions Engineering teams are analytics consultants who embed themselves within our enterprise customer teams to drive our customers’ business outcomes. We work with prospects and customers throughout the customer journey to understand what drives value and serve as the technical counterpart to our Sales organization to deliver on that value.</p>\n<p>You will partner closely with Account Executives, Account Managers, Product, Engineering, and Support to successfully roll out self-serve analytics within our customers’ organizations, help the customer manage change, execute on technical projects and services that delight our customers, and ultimately drive ROI on the customer’s Mixpanel investment.</p>\n<p>About the Role:</p>\n<p>As a CSA, you will partner with customers throughout the customer journey to understand what drives value, beginning in pre-sales, where you run proof of concepts to demonstrate quick time to value, and continuing through post-sales onboarding and implementation, where you set customers up for long-term success with scalable implementation and data governance best practices. 
Throughout the entire customer lifecycle, you will work to understand how analytics can drive business value for your customers and will advise them on how to maximize the value of Mixpanel, including managing change during Mixpanel’s rollout, defining and achieving ROI, and identifying areas of improvement in their current usage of analytics.</p>\n<p>For large enterprise customers, post onboarding, you will also continue alongside the Account Managers to drive data trust and product adoption for 100+ end user teams through a change management rollout approach.</p>\n<p>Responsibilities:</p>\n<p>Serve as a trusted technical advisor for prospects/customers to provide strategic consultation on data architecture, governance, instrumentation, and business outcomes</p>\n<p>Effectively communicate at most levels of the customer’s organization to influence business outcomes via Mixpanel, design and execute a comprehensive analytics strategy, and unblock technical and organizational roadblocks</p>\n<p>Own the customer’s success with Mixpanel, documenting and delivering ROI to the customer throughout their journey to transform their business with self-serve analytics</p>\n<p>Own onboarding and data health for your assigned customers/projects, including ongoing enhancements to their data quality and overall tech stack integration</p>\n<p>Engage with customers’ engineering, product management, and marketing teams to handle technical onboarding, optimize Mixpanel deployments, and improve data trust</p>\n<p>Deliver a variety of technical services ranging from data architecture consultations to adoption and change management best practices</p>\n<p>Leverage modern data architecture expertise to create scalable data governance practices and data trust for our customers, including data optimization and re-implementation projects</p>\n<p>Deliver success outcomes while balancing project timelines, scope creep, and unanticipated issues</p>\n<p>Bridge the 
technical-business gap with your customers, working with business stakeholders to define a strategic vision for Mixpanel and then working with the right business and technical contacts to execute that vision</p>\n<p>Collaborate with our technical and solutions partners as needed on data optimization and onboarding projects</p>\n<p>Be a technical sponsor for internal engagements with Mixpanel product and engineering teams to prioritize product and systems tasks from clients</p>\n<p>We&#39;re Looking For Someone Who Has</p>\n<p>3 to 5 years of experience consulting on defining and delivering ROI through new tool implementations</p>\n<p>Experience working with Director-level members of the customer organization to define a strategic vision and successfully leveraging those members to deliver on that vision</p>\n<p>The ability to communicate with stakeholders at most levels of an organization, from talking with developers about the ins and outs of an API to talking to a Director of Data Science/Product Management about organizational efficiency</p>\n<p>The ability to manage complex projects with assorted client stakeholders, working across teams and departments to execute real change</p>\n<p>A demonstrated record of success in a customer success, client-facing professional services, consulting, or technical project management role</p>\n<p>Excellent written, analytical, and communication skills</p>\n<p>Strong process and/or project delivery discipline</p>\n<p>Eagerness to learn new technologies and adapt to evolving customer needs</p>\n<p>We&#39;d Be Extra Excited For Someone Who Has</p>\n<p>Experience in data querying, modeling, and transformation in at least one core tool, including SQL / dbt / Python / Business Intelligence tools / Product Analytics tools, etc.</p>\n<p>Familiarity with databases and cloud data warehouses like Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, etc.</p>\n<p>Familiarity with product analytics implementation methods like 
SDKs, Customer Data Platforms (CDPs), Event Streaming, Reverse ETL, etc.</p>\n<p>Familiarity with analytics best practices across business segments and verticals</p>\n<p>Benefits and Perks</p>\n<p>Comprehensive Medical, Vision, and Dental Care</p>\n<p>Mental Wellness Benefit</p>\n<p>Generous Vacation Policy &amp; Additional Company Holidays</p>\n<p>Enhanced Parental Leave</p>\n<p>Volunteer Time Off</p>\n<p>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</p>\n<p>Culture Values</p>\n<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>\n<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>\n<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>\n<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>\n<p>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north star.</p>\n<p>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</p>\n<p>Why choose Mixpanel?</p>\n<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors, including Andreessen Horowitz, Sequoia, YC, and, most recently, Bain Capital.</p>\n<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>\n<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>\n<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>\n<p>Mixpanel is an equal opportunity 
employer supporting workforce diversity.</p>\n<p>At Mixpanel, we are focused on the things that really matter: our people, our customers, and our partners, out of a recognition that those relationships are the most valuable assets we have.</p>\n<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>\n<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, or any other protected characteristic.</p>","url":"https://yubhub.co/jobs/job_477d343e-e37","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mixpanel","sameAs":"https://mixpanel.com","logo":"https://logos.yubhub.co/mixpanel.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mixpanel/jobs/7506821","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data architecture","governance","instrumentation","business outcomes","data querying","modeling","transforming","SQL","dbt","Python","Business Intelligence tools","Product Analytics tools"],"x-skills-preferred":["databases","cloud data warehouses","Google Cloud","Amazon Redshift","Microsoft Azure","Snowflake","Databricks","SDKs","Customer Data Platforms","Event Streaming","Reverse ETL"],"datePosted":"2026-04-18T15:57:25.195Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru, India (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data architecture, governance, instrumentation, business outcomes, data querying, modeling, transforming, SQL, dbt, Python, Business Intelligence tools, Product Analytics tools, databases, cloud data warehouses, Google Cloud, Amazon 
Redshift, Microsoft Azure, Snowflake, Databricks, SDKs, Customer Data Platforms, Event Streaming, Reverse ETL"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_03224784-9c2"},"title":"Senior Data Engineering Manager","description":"<p>Job Title: Senior Data Engineering Manager</p>\n<p>Location: Dublin, Ireland</p>\n<p>Department: R&amp;D</p>\n<p>Job Description:</p>\n<p>Intercom is seeking a Senior Data Engineering Manager to lead the design and evolution of the core infrastructure that powers our entire data ecosystem. As a leader, you will partner with product and business teams to drive key data initiatives and ensure the success of our data engineering team.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Next-Gen Platform Evolution: Partner with product and business teams to design and implement the next generation of our data stack, ensuring it can meet the demands of advanced analytics and AI applications.</li>\n</ul>\n<ul>\n<li>Enablement Through Tooling: Partner closely with Analytics Engineers, Analysts, and Data Scientists to build self-service tooling and infrastructure that enables them to move fast and deploy safely.</li>\n</ul>\n<ul>\n<li>Data Quality Guardianship: Implement advanced monitoring systems to proactively detect, surface, and resolve data quality issues across our high-throughput environment.</li>\n</ul>\n<ul>\n<li>Driving Automation: Develop automation and tooling that streamlines the creation and discovery of high-quality analytics data, making the entire data lifecycle more efficient.</li>\n</ul>\n<p>Strategic Impact You&#39;ll Drive:</p>\n<ul>\n<li>GTM Data Platform Strategy: Build the data acquisition strategy that will enable us to build the next generation of business-focused internal software.</li>\n</ul>\n<ul>\n<li>Conversational BI Strategy: Lead the charge to shift away from complex, technical reporting toward natural language interaction to make data truly 
democratized and accessible.</li>\n</ul>\n<ul>\n<li>Platform &amp; Warehousing Strategy: Lead the architectural and cost review and revamp of our core data infrastructure to ensure it can scale exponentially for future growth and advanced use cases.</li>\n</ul>\n<p>Recent Wins You&#39;ll Build Upon:</p>\n<ul>\n<li>AI-assisted Local Analytics Development Environment for Airflow and dbt.</li>\n</ul>\n<ul>\n<li>Data-rich AI apps containerized on Snowflake SPCS.</li>\n</ul>\n<ul>\n<li>A new, modern data catalog solution.</li>\n</ul>\n<ul>\n<li>Migrating critical MySQL ingestion pipelines from Aurora to PlanetScale.</li>\n</ul>\n<p>Who You Are:</p>\n<ul>\n<li>A leader, a builder, and a problem-solver who thrives on solving real-world business problems.</li>\n</ul>\n<ul>\n<li>7+ years of experience in the data space, leading teams of 6+ engineers.</li>\n</ul>\n<ul>\n<li>Stakeholder focus: ability to communicate complex technical solutions to a business-focused audience and vice versa.</li>\n</ul>\n<ul>\n<li>Technical depth: not afraid to get their hands dirty and write code when needed.</li>\n</ul>\n<ul>\n<li>A leader and mentor: naturally recognizes opportunities to step back and mentor others.</li>\n</ul>\n<p>Bonus Points (Our Modern Stack Knowledge):</p>\n<ul>\n<li>Airflow at scale: extensive experience working with Apache Airflow, especially the nuances of operating it reliably in a high-volume environment.</li>\n</ul>\n<ul>\n<li>Modern data stack fluency: familiarity with tools like Snowflake and dbt.</li>\n</ul>\n<ul>\n<li>Future-focused: keeps a keen eye on industry trends and emerging technologies.</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Competitive salary and equity in a fast-growing start-up.</li>\n</ul>\n<ul>\n<li>We serve lunch every weekday, plus a variety of snack foods and a fully stocked kitchen.</li>\n</ul>\n<ul>\n<li>Regular compensation reviews - we reward great work!</li>\n</ul>\n<ul>\n<li>Pension scheme &amp; match up to 4%.</li>\n</ul>\n<ul>\n<li>Peace 
of mind with life assurance, as well as comprehensive health and dental insurance for you and your dependents.</li>\n</ul>\n<ul>\n<li>Open vacation policy and flexible holidays so you can take time off when you need it.</li>\n</ul>\n<ul>\n<li>Paid maternity leave, as well as 6 weeks paternity leave for fathers, to let you spend valuable time with your loved ones.</li>\n</ul>\n<ul>\n<li>If you’re cycling, we’ve got you covered on the Cycle-to-Work Scheme, with secure bike storage too.</li>\n</ul>\n<ul>\n<li>MacBooks are our standard, but we also offer Windows for certain roles when needed.</li>\n</ul>\n<p>Policies:</p>\n<ul>\n<li>Intercom has a hybrid working policy. We believe that working in person helps us stay connected, collaborate more easily, and create a great culture while still providing flexibility to work from home.</li>\n</ul>\n<ul>\n<li>We have a radically open and accepting culture at Intercom. We avoid spending time on divisive subjects to foster a safe and cohesive work environment for everyone.</li>\n</ul>","url":"https://yubhub.co/jobs/job_03224784-9c2","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Intercom","sameAs":"https://www.intercom.com/","logo":"https://logos.yubhub.co/intercom.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/intercom/jobs/7574762","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Airflow","Apache Airflow","DBT","Snowflake","Data Engineering","Data Science","Analytics","Data Management","Data Quality","Automation","Cloud Computing","Data Warehousing","Big Data","Machine Learning","Artificial Intelligence"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:57:06.635Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dublin, 
Ireland"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Airflow, Apache Airflow, DBT, Snowflake, Data Engineering, Data Science, Analytics, Data Management, Data Quality, Automation, Cloud Computing, Data Warehousing, Big Data, Machine Learning, Artificial Intelligence"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_35458586-a42"},"title":"Enterprise Architect, Finance & Legal Systems","description":"<p>We are seeking an experienced Enterprise Architect to join our Technology, Data and Intelligence team. As an Enterprise Architect, you will be responsible for defining and delivering the technology architecture strategy across Finance and Legal functions, enabling data-driven decision-making, automation, and operational excellence.</p>\n<p>Key responsibilities will include:</p>\n<ul>\n<li>Defining the target-state architecture for Finance and Legal applications, ensuring alignment with enterprise strategy and growth objectives.</li>\n<li>Leading the design and implementation of end-to-end architectural solutions for Finance and Legal systems, ensuring integration, scalability, and performance across the enterprise.</li>\n<li>Developing and maintaining a multi-year roadmap for modernization across ERP, FP&amp;A, Legal, and Sales Compensation systems.</li>\n<li>Ensuring systems are designed with identity-first security principles, integrating with Okta and other IAM solutions for authentication, authorization, and compliance.</li>\n</ul>\n<p>The ideal candidate will have:</p>\n<ul>\n<li>15+ years of software engineering experience, including significant time as an Architect or Principal in ERP Systems (Oracle/Netsuite/SAP), FP&amp;A Systems (Anaplan) and/or CLM systems (Aptus/Conga/Ironclad).</li>\n<li>Excellent storytelling and communication skills, comfortable presenting to both technical and executive stakeholders.</li>\n<li>Multiple ERP 
(Oracle or Netsuite) full-cycle implementation experience.</li>\n<li>Deep understanding of the Finance business process areas – Order to Cash, Record to Report, Source to Pay, Plan to Report (FP&amp;A), Treasury, Credit Collection, Revenue Recognition, Subscription Billing, and Contract Lifecycle Management within Legal Ops.</li>\n<li>Demonstrated hands-on experience architecting functional and technical solutions within major business applications, with specific expertise in NetSuite (or Oracle), Aptus/Conga (or IronClad), Anaplan, Coupa, Scout, and tax engines such as Avalara, Vertex, or OneSource – including understanding their data models and APIs in the context of solution development and integrations.</li>\n<li>Experience architecting and delivering AI agents using leading LLMs such as Gemini, OpenAI, or Claude.</li>\n<li>Experience managing software and/or vendor selection with the enterprise’s end-state architecture in view.</li>\n<li>Proficient understanding of middleware platforms such as MuleSoft, Workato, Boomi, or Informatica for connecting Finance, Legal, CRM, and data platforms.</li>\n<li>Familiarity with code, configuration, and system performance standards/reviews to ensure quality, scalability, and compliance with enterprise standards.</li>\n<li>Proficiency with AWS, Azure, or GCP, with knowledge of data lakes/warehouses (Snowflake, Redshift, BigQuery) for SaaS revenue and compliance analytics.</li>\n<li>Identity &amp; Security: knowledge of SSO, OAuth, SAML, SCIM, and Zero Trust principles, with hands-on integration experience in Okta or similar IAM platforms.</li>\n</ul>\n<p>In addition to the above skills and experience, the ideal candidate will be passionate about innovation, AI adoption, and continuous improvement aligned with Okta’s mission to build secure, intelligent, and connected business systems.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_35458586-a42","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7442186","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$150,000 - $250,000 per year","x-skills-required":["Enterprise Architecture","Cloud Computing","Identity and Access Management","Security","Data Analytics","Machine Learning","Artificial Intelligence","Software Development","DevOps","Agile Methodologies"],"x-skills-preferred":["AWS","Azure","GCP","Snowflake","Redshift","BigQuery","MuleSoft","Workato","Boomi","Informatica"],"datePosted":"2026-04-18T15:56:59.822Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Enterprise Architecture, Cloud Computing, Identity and Access Management, Security, Data Analytics, Machine Learning, Artificial Intelligence, Software Development, DevOps, Agile Methodologies, AWS, Azure, GCP, Snowflake, Redshift, BigQuery, MuleSoft, Workato, Boomi, Informatica","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":150000,"maxValue":250000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_aeba45bc-3e4"},"title":"Senior Solutions Engineer","description":"<p>About Mixpanel</p>\n<p>Mixpanel turns data clarity into innovation. 
Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>\n<p>Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees. Mixpanel delivers insights that customers trust.</p>\n<p>Visit mixpanel.com to learn more.</p>\n<p>About the Customer Success &amp; Solutions Engineering Team</p>\n<p>Mixpanel’s Customer Success &amp; Solutions Engineering teams are analytics consultants who embed themselves within our enterprise customer teams to drive our customers’ business outcomes. We work with prospects and customers throughout the customer journey to understand what drives value and serve as the technical counterpart to our Sales organization to deliver on that value.</p>\n<p>You will partner closely with Account Executives, Account Managers, Product, Engineering, and Support to successfully roll out self-serve analytics within our customers’ organizations, help the customer manage change, execute on technical projects and services that delight our customers, and ultimately drive ROI on the customer’s Mixpanel investment.</p>\n<p>About the Role</p>\n<p>Our SEs are inquisitive, nimble, and able to clearly articulate the technical benefits and requirements of Mixpanel to developers and product managers, while also communicating the business value of our product to high-level executives. In your first month, you’ll become a Mixpanel expert, both in features and functionality as well as implementation. You’ll have the opportunity to shadow customer calls and demos with current Sales Engineers and Account Executives while learning to articulate our value proposition. 
You’ll also be trained on Mixpanel’s internal systems and tools to set you up for success.</p>\n<p>Within your first three months, you’ll be directly involved in deal cycles with Commercial Account Executives. You’ll lead the technical qualification for customer use cases and deliver customized demos for prospects. You’ll work directly with leadership at the prospect’s organization to understand business challenges that can be solved through an analytics platform and consult on how Mixpanel can address those challenges to achieve a strong ROI. You’ll also work with the prospect’s business and technical teams to scope and execute proof-of-concept projects to establish Mixpanel’s value, including consulting on data ingestion methods, overall architecture, success criteria, and rollout strategies for analytics tools across an organization.</p>\n<p>Responsibilities</p>\n<p>Serve as a trusted technical advisor for prospects, providing strategic consultation on data architecture, governance, instrumentation, and business outcomes.</p>\n<p>Communicate and consult effectively at all levels of the customer’s organization to earn trust and influence buying decisions.</p>\n<p>Bridge the technical-business gap, working with senior stakeholders to define success for proof-of-concepts and ensuring successful execution and outcomes.</p>\n<p>Leverage your Mixpanel expertise and technical/consultative skills to impart best practices throughout proof-of-concept projects.</p>\n<p>Partner with Account Executives to drive revenue growth, serving as the key technical contact for customers.</p>\n<p>Partner with post-sales teams to ensure that pre-sales value propositions translate into tangible post-sales results.</p>\n<p>Develop relationships and uncover the needs of key technical stakeholders within your assigned book of business.</p>\n<p>Be the “Voice of the Prospect” by collecting feedback from potential Mixpanel customers and sharing it with the Product team.</p>\n<p>We&#39;re Looking 
For Someone Who Has</p>\n<p>The ability to communicate with stakeholders at all levels, from discussing APIs with developers to organizational efficiency with CIOs.</p>\n<p>A demonstrated track record of qualifying and selling technical solutions to executive stakeholders.</p>\n<p>6+ years of experience in a Software-as-a-Service Sales Engineering or related role.</p>\n<p>Experience in data querying, modeling, and transformation using tools such as SQL, dbt, Python, Business Intelligence platforms, or Product Analytics tools.</p>\n<p>Familiarity with databases and cloud data warehouses (e.g., Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks).</p>\n<p>A successful record of experience in sales engineering, customer success, client-facing professional services, consulting, or technical project management.</p>\n<p>Excellent written, analytical, communication, and presentation skills.</p>\n<p>Strong process and project delivery discipline.</p>\n<p>The ability to travel.</p>\n<p>Fluency in multiple languages; German preferred.</p>\n<p>Benefits and Perks</p>\n<p>Comprehensive Medical, Vision, and Dental Care</p>\n<p>Mental Wellness Benefit</p>\n<p>Generous Vacation Policy &amp; Additional Company Holidays</p>\n<p>Enhanced Parental Leave</p>\n<p>Volunteer Time Off</p>\n<p>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</p>\n<p>Culture Values</p>\n<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>\n<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>\n<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>\n<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>\n<p>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success 
is our north star.</p>\n<p>Why choose Mixpanel?</p>\n<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors, including Andreessen Horowitz, Sequoia, YC, and, most recently, Bain Capital.</p>\n<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>\n<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>\n<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>\n<p>Mixpanel is an equal opportunity employer supporting workforce diversity.</p>\n<p>At Mixpanel, we are focused on the things that really matter: our people, our customers, and our partners, out of a recognition that those relationships are the most valuable assets we have.</p>\n<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>\n<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, veteran status, or disability status.</p>\n<p>Pursuant to the San Francisco Fair Chance Ordinance or other similar laws that may be applicable, we will consider for employment qualified applicants with arrest and conviction records.</p>\n<p>We’ve immersed ourselves in our Culture and Values as our guiding principles for the impact we want to have and the future we are building.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_aeba45bc-3e4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mixpanel","sameAs":"https://mixpanel.com","logo":"https://logos.yubhub.co/mixpanel.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mixpanel/jobs/7407407","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","dbt","Python","Business Intelligence platforms","Product Analytics tools","Databases","Cloud data warehouses","Google Cloud","Amazon Redshift","Microsoft Azure","Snowflake","Databricks"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:56:33.243Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, UK (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, dbt, Python, Business Intelligence platforms, Product Analytics tools, Databases, Cloud data warehouses, Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0a154c39-08a"},"title":"Senior Machine Learning Platform Engineer (Platform)","description":"<p>Ready to be pushed beyond what you think you’re capable of?</p>\n<p>At Coinbase, our mission is to increase economic freedom in the world.</p>\n<p>We&#39;re seeking a Senior Machine Learning Platform Engineer to join our Machine Learning Platform team. The team builds the foundational components for feature engineering and training/serving ML models at Coinbase. Our platform is used to combat fraud, personalize user experiences, and to analyze blockchains.</p>\n<p>As a Senior Machine Learning Platform Engineer, you will:</p>\n<p>Form a deep understanding of our Machine Learning Engineers’ needs and our current capabilities and gaps. 
Mentor our talented junior engineers on how to build high-quality software, and take their skills to the next level. Continually raise our engineering standards to maintain high availability and low latency for our ML inference infrastructure that runs both predictive ML models and LLMs. Optimize low-latency streaming pipelines to give our ML models the freshest and highest-quality data. Evangelize state-of-the-art practices on building high-performance distributed training jobs that process large volumes of data. Build tooling to observe the quality of data going into our models and to detect degradations impacting model performance.</p>\n<p>What we look for in you:</p>\n<p>5+ years of industry experience as a Software Engineer. Strong understanding of distributed systems. Lead by example through high-quality code and excellent communication skills. A great sense of design, with the ability to bring clarity to complex technical requirements. Treat other engineers as customers, and have an obsessive focus on delivering them a seamless experience. Mastery of the fundamentals, such that you can quickly jump between many varied technologies and still operate at a high level. Demonstrated ability to responsibly use generative AI tools and copilots (e.g., LibreChat, Gemini, Glean) in daily workflows, continuously learn as tools evolve, and apply human-in-the-loop practices to deliver business-ready outputs and drive measurable improvements in efficiency, cost, and quality.</p>\n<p>Nice to haves:</p>\n<p>Experience building ML models and working with ML systems. Experience working on a platform team and building developer tooling. Experience with the technologies we use (Python, Golang, Ray, Tecton, Spark, Airflow, Databricks, Snowflake, and DynamoDB).</p>\n<p>Job ID: P75535</p>\n<p>Pay Transparency Notice: Depending on your work location, the target annual base salary for this position can range as detailed below. 
Total compensation may also include equity and bonus eligibility and benefits (including medical, dental, vision and 401(k)). Annual base salary range (excluding equity and bonus): $186,065-$225,000 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0a154c39-08a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Coinbase","sameAs":"https://www.coinbase.com/","logo":"https://logos.yubhub.co/coinbase.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coinbase/jobs/7604203","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$186,065-$225,000 USD","x-skills-required":["distributed systems","high-quality code","excellent communication skills","design","fundamentals","generative AI tools","copilots"],"x-skills-preferred":["ML models","ML systems","platform team","developer tooling","Python","Golang","Ray","Tecton","Spark","Airflow","Databricks","Snowflake","DynamoDB"],"datePosted":"2026-04-18T15:56:24.447Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - USA"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"distributed systems, high-quality code, excellent communication skills, design, fundamentals, generative AI tools, copilots, ML models, ML systems, platform team, developer tooling, Python, Golang, Ray, Tecton, Spark, Airflow, Databricks, Snowflake, DynamoDB","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":186065,"maxValue":225000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_84875ccd-0a5"},"title":"Senior Partner Marketing Manager","description":"<p>As a Senior Partner Marketing Manager, 
you&#39;ll play a critical role in shaping and executing co-marketing initiatives with some of our most important technology partners globally.</p>\n<p>Your work will amplify the reach and impact of dbt across the modern data stack, helping to elevate our brand and drive growth through the ecosystem.</p>\n<p>This is a highly cross-functional role where strategic thinking, creativity, and strong collaboration skills will be key to success.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Lead the creation and execution of marketing campaigns, programs, events, and activities with strategic technology partners.</li>\n<li>Collaborate closely with the Revenue Marketing, Partnerships, and Product Marketing teams to ensure GTM partner plans align with dbt Labs&#39; broader business goals.</li>\n<li>Build and nurture relationships with marketing counterparts at key partners like Snowflake, Google, AWS, Microsoft, and Databricks to align on co-marketing efforts and shared objectives.</li>\n<li>Clearly articulate the value of dbt to partners and support them in promoting the platform internally and to their customer base.</li>\n<li>Own the development of joint messaging and co-branded assets, including blogs, webinars, solution briefs, and presentation decks, ensuring alignment and consistency across all public-facing content.</li>\n<li>Create internal enablement materials to educate and empower sales teams to leverage partner campaigns and initiatives.</li>\n<li>Gain a deep understanding of partner business strategies and priorities; design co-marketing programs that provide mutual value.</li>\n<li>Set and manage OKRs, track program performance, and deliver quarterly reviews with partners to assess impact, identify opportunities, and ensure strategic alignment.</li>\n<li>Develop annual GTM marketing plans tailored to individual partners, accounting for geographic and vertical-specific nuances.</li>\n<li>Exercise strategic judgment in deciding which partner activities 
to pursue and how best to allocate time and resources for maximum impact.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>8+ years of experience in B2B marketing or similar, particularly within the data or software industry, with a strong track record of building and executing successful partner marketing programs.</li>\n<li>A &#39;builder&#39; mindset: you enjoy solving problems, creating structure where none exists, and working cross-functionally to drive measurable outcomes.</li>\n<li>Deep familiarity with the modern data ecosystem and players such as Snowflake, Google, AWS, Microsoft, and Databricks.</li>\n<li>The ability to navigate complex partner organizations and manage relationships with multiple stakeholders across competing interests.</li>\n<li>Strong storytelling and positioning skills: you know how to distill joint value propositions into compelling messaging and content.</li>\n<li>Comfort operating in a fast-paced, dynamic environment with high levels of ambiguity.</li>\n<li>A broad understanding of integrated marketing strategies, including digital campaigns, field marketing, and industry events.</li>\n<li>Exceptional communication skills, including concise writing and confident presentation abilities, especially with senior stakeholders.</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Experience working asynchronously within a remote, distributed team.</li>\n<li>Prior experience working for or closely with any of dbt Labs&#39; strategic partners.</li>\n<li>Familiarity with the role dbt Labs plays in the cloud data warehouse ecosystem and the modern data stack.</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Unlimited vacation time with a culture that actively encourages time off</li>\n<li>401k plan with 3% guaranteed company contribution</li>\n<li>Comprehensive healthcare coverage</li>\n<li>Generous paid parental leave</li>\n<li>Flexible stipends for:</li>\n<li>Health &amp; Wellness</li>\n<li>Home Office 
Setup</li>\n<li>Cell Phone &amp; Internet</li>\n<li>Learning &amp; Development</li>\n<li>Office Space</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>We offer competitive compensation packages commensurate with experience, including salary, RSUs, and, where applicable, performance-based pay. Our Talent Acquisition Team can answer questions around dbt Labs&#39; total rewards during your interview process.</p>\n<p>In select locations (including Austin, Boston, Chicago, Denver, Los Angeles, Philadelphia, New York City, San Francisco, Washington, DC, and Seattle), an alternate range may apply, as specified below.</p>\n<ul>\n<li>The typical starting salary range for this role is: $132,000-$188,700</li>\n<li>The typical starting salary range for this role in the select locations listed is: $147,000-$209,000</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_84875ccd-0a5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673163005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$132,000-$188,700","x-skills-required":["B2B marketing","Partner marketing","Digital campaigns","Field marketing","Industry events","Cloud data warehouse ecosystem","Modern data stack","Data engineering","Analytics engineering","Snowflake","Google","AWS","Microsoft","Databricks"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:55:34.081Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Marketing","industry":"Technology","skills":"B2B marketing, Partner marketing, Digital campaigns, Field marketing, 
Industry events, Cloud data warehouse ecosystem, Modern data stack, Data engineering, Analytics engineering, Snowflake, Google, AWS, Microsoft, Databricks","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":132000,"maxValue":188700,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1f2f48ad-46d"},"title":"Senior Analytics Engineer","description":"<p>We&#39;re looking for a dedicated Analytics Engineer to join the AI Group to help us with data platform development, cross-functional collaboration, data strategy &amp; governance, advanced analytics &amp; insights, automation &amp; optimization, innovation in data infrastructure, and strategic influence.</p>\n<p>As an Analytics Engineer, you will design, build, and manage scalable data pipelines and ETL processes to support a robust, analytics-ready data platform. You will partner with AI analysts, ML scientists, engineers, and business teams to understand data needs and ensure accurate, reliable, and ergonomic data solutions. You will lead initiatives in data model development, data quality ownership, warehouse management, and production support for critical workflows. You will conduct data analysis and build custom models to support strategic business decisions and performance measurement. You will streamline data collection and reporting processes to reduce manual effort and improve efficiency. You will create scalable solutions like unified data pipelines and access control systems to meet evolving organisational needs. You will work with partner teams to align data collection with long-term analytics and feature development goals.</p>\n<p>We&#39;re looking for someone who writes advanced SQL with a preference for well-architected data models, optimized query performance, and clearly documented code. 
You should be familiar with the modern data stack, including dbt and Snowflake. You should have a growth mindset and eagerness to learn. You should exhibit great judgment and sharp business and product instincts that allow you to differentiate essential versus nice-to-have and to make good choices about trade-offs. You should have excellent communication skills and tailor explanations of technical concepts to a variety of audiences.</p>\n<p>Nice to have: exposure to Apache Airflow or other DAG frameworks; experience with Tableau, Looker, or a similar visualization/business intelligence platform; experience with operational tools and business systems like Google Analytics, Marketo, Salesforce, Segment, or Stripe; and familiarity with Python.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_1f2f48ad-46d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Intercom","sameAs":"https://www.intercom.com/","logo":"https://logos.yubhub.co/intercom.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/intercom/jobs/7807847","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["advanced SQL","dbt","Snowflake","data pipeline development","ETL process management","data strategy & governance","advanced analytics & insights","automation & optimization","innovation in data infrastructure","strategic influence"],"x-skills-preferred":["Apache Airflow","Tableau","Looker","Google Analytics","Marketo","Salesforce","Segment","Stripe","Python"],"datePosted":"2026-04-18T15:55:10.503Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dublin, Ireland"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"advanced SQL, dbt, Snowflake, data pipeline development, ETL process management, 
data strategy & governance, advanced analytics & insights, automation & optimization, innovation in data infrastructure, strategic influence, Apache Airflow, Tableau, Looker, Google Analytics, Marketo, Salesforce, Segment, Stripe, Python"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0d42a662-43e"},"title":"Senior Product Manager - Identity","description":"<p>We&#39;re seeking a Senior Product Manager, Identity to own the intelligence layer that connects digital signals to real-world business outcomes. As a Senior PM, you&#39;ll work at the intersection of platform engineering, data science, and customer value creation. You&#39;ll drive systematic improvements to visitor-to-person and IP-to-company match rates, integrate new data sources, and ensure identity data flows seamlessly across ZoomInfo&#39;s GTM Intelligence Platform.</p>\n<p>Your work directly impacts how thousands of customers identify, engage, and convert their ideal buyers. 
You&#39;ll partner with product marketing, sales enablement, and customer success to position identity capabilities and drive adoption.</p>\n<p>Key responsibilities include:</p>\n<p>Strategic Leadership:</p>\n<ul>\n<li>Own product strategy and roadmap for identity resolution, defining vision and success metrics that drive measurable improvements in match rates and customer value</li>\n<li>Identify and prioritize opportunities to enhance identity resolution through new data sources, improved algorithms, and better data quality</li>\n<li>Evaluate, negotiate, and integrate 3rd party data partnerships that expand identity coverage and accuracy</li>\n<li>Define and track core metrics including visitor-to-person match rates, IP-to-company accuracy, coverage, latency, and downstream revenue impact</li>\n<li>Partner with product marketing, sales enablement, and customer success to position identity capabilities and drive adoption</li>\n</ul>\n<p>Execution Excellence:</p>\n<ul>\n<li>Deliver measurable improvements to identity resolution performance through rigorous prioritization, rapid experimentation, and iterative execution</li>\n<li>Diagnose and resolve data quality issues that impact match rates, customer trust, and product experience</li>\n<li>Build scalable monitoring and alerting systems that track identity resolution health and enable proactive issue detection</li>\n<li>Create frameworks for evaluating new data sources and measuring their incremental impact on resolution performance</li>\n<li>Conduct deep discovery using data analysis, customer interviews, and competitive research to uncover opportunities for improvement</li>\n</ul>\n<p>Cross-Functional Partnership:</p>\n<ul>\n<li>Partner with Platform, Application, and Data teams to ensure seamless integration of identity data into customer-facing products</li>\n<li>Collaborate with data science and engineering to solve complex problems in entity resolution, probabilistic matching, and signal enrichment</li>\n<li>Work with analytics teams to instrument metrics, build dashboards, and enable data-driven decision making</li>\n<li>Enable go-to-market teams through training, documentation, and competitive positioning</li>\n</ul>\n<p>What You Bring:</p>\n<p>Required Qualifications:</p>\n<ul>\n<li>5+ years of product management experience in data-intensive B2B SaaS environments, with focus on data products, identity systems, or platform-level capabilities</li>\n<li>Deep understanding of identity resolution, entity matching, IP-to-company mapping, visitor identification, and the evolving privacy landscape</li>\n<li>Technical fluency with SQL, Snowflake, and BI tools (Tableau, Looker) - ability to independently analyze data, diagnose issues, and make data-driven decisions</li>\n<li>Proven track record identifying, diagnosing, and resolving data quality issues at scale with understanding of coverage, accuracy, and freshness trade-offs</li>\n<li>Cross-functional leadership - demonstrated ability to influence and align engineering, product, data science, and go-to-market teams without formal authority</li>\n<li>Customer-centric mindset with experience optimizing adoption funnels and driving engagement for technical products</li>\n<li>Excellent communication skills - ability to translate complex technical concepts into clear, value-based narratives for executives and customers</li>\n<li>Bias for action - track record of moving quickly from insight to execution with iterative, test-and-learn approach</li>\n<li>B2B SaaS experience with enterprise data products, platform products, or GTM tools</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Experience with AdTech, MarTech, or Real-Time Bidding (RTB) ecosystems</li>\n<li>Background in data science, engineering, or analytics prior to product management</li>\n<li>Familiarity with 3rd party data sourcing, evaluation, and integration</li>\n<li>Experience with probabilistic matching, entity resolution, or graph-based identity systems</li>\n<li>Knowledge of data privacy regulations and their product implications</li>\n<li>Experience in sales intelligence, marketing technology, or revenue operations domains</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0d42a662-43e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"ZoomInfo","sameAs":"https://www.zoominfo.com/","logo":"https://logos.yubhub.co/zoominfo.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/zoominfo/jobs/8477058002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$126,000-$198,000 USD","x-skills-required":["SQL","Snowflake","BI tools (Tableau, Looker)","Identity resolution","Entity matching","IP-to-company mapping","Visitor identification","Data quality","Data science","Engineering","Analytics"],"x-skills-preferred":["AdTech","MarTech","Real-Time Bidding (RTB) ecosystems","Probabilistic matching","Entity resolution","Graph-based identity systems","Data privacy regulations","Sales intelligence","Marketing technology","Revenue operations domains"],"datePosted":"2026-04-18T15:54:57.461Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bethesda, Maryland, United States; Remote; Vancouver, Washington, United States; Waltham, Massachusetts, United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Snowflake, BI tools (Tableau, Looker), Identity resolution, Entity matching, IP-to-company mapping, Visitor identification, Data quality, Data science, Engineering, Analytics, AdTech, MarTech, Real-Time Bidding (RTB) ecosystems, Probabilistic matching, Entity resolution, Graph-based identity systems, Data privacy regulations, Sales intelligence, Marketing technology, Revenue operations 
domains","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":126000,"maxValue":198000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d0db29d2-16c"},"title":"Senior Data Quality Analyst","description":"<p>We are seeking a Senior Data Quality Analyst to join our Data Products team. As a Senior Data Quality Analyst, you will be responsible for ensuring the accuracy and quality of our data outputs. This includes designing and running pre/post-release comparisons across key attributes, identifying and documenting issues missed by automated tests, and making disposition recommendations for each issue.</p>\n<p>In addition to data output validation, you will also be responsible for bug investigation and root cause analysis. This will involve reviewing and prioritizing DPQ Jira issues, querying Snowflake to trace anomalies to source, and producing clear reports outlining the issue, evidence, likely cause, and next steps for both technical and non-technical audiences.</p>\n<p>You will also be responsible for preparing the weekly publication review package, which includes compiling a weekly record of which data pipelines ran, their completion status, any anomalies in run time or output volume, and a comparison against expected behavior.</p>\n<p>The ideal candidate will have 6+ years of experience in data quality, data analysis, or analytics engineering, preferably in healthcare or a related field. They will also have strong SQL skills, experience with Snowflake, and the ability to work through ambiguous issues from signal to root cause.</p>\n<p>This is a senior-level role that requires a high degree of autonomy and independence. 
The successful candidate will be able to work effectively in a fast-paced environment and prioritize multiple tasks and projects simultaneously.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d0db29d2-16c","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Komodo Health","sameAs":"https://www.komodohealth.com/","logo":"https://logos.yubhub.co/komodohealth.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/komodohealth/jobs/8476216002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data quality","data analysis","analytics engineering","SQL","Snowflake","bug investigation","root cause analysis"],"x-skills-preferred":["Python","Gemini","Claude","Cursor"],"datePosted":"2026-04-18T15:54:47.018Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, NY"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"data quality, data analysis, analytics engineering, SQL, Snowflake, bug investigation, root cause analysis, Python, Gemini, Claude, Cursor"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c6d97a39-7f0"},"title":"Senior Engineering Manager, Activation","description":"<p>Job Title: Senior Engineering Manager, Activation</p>\n<p>Join us at Brex, the intelligent finance platform that enables companies to spend smarter and move faster. 
We&#39;re looking for a seasoned engineering leader to lead our engineering group focused on building the systems and product experiences that power customer activation at Brex.</p>\n<p>As a Senior Engineering Manager, you will be responsible for driving business and product strategies, collaborating with cross-functional partners, leveraging AI to reimagine and automate onboarding and implementation workflows, and leading and managing multiple teams of engineers.</p>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s or Master&#39;s degree in Computer Science, Engineering, or a related field</li>\n<li>Strong technical background and understanding of software development principles</li>\n<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences</li>\n<li>Demonstrated track record of shipping customer-facing features across multiple release cycles</li>\n<li>3+ years of experience managing or leading multiple technical teams in a high-growth environment</li>\n</ul>\n<p>Bonus points:</p>\n<ul>\n<li>Experience with data platforms such as Snowflake, Hex, or similar</li>\n<li>You have started your own technology venture or were an early technical founder/employee</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $300,000 - $375,000. 
However, the starting base pay will depend on a number of factors including the candidate&#39;s location, skills, experience, market demands, and internal pay parity.</p>\n<p>If you&#39;re a champion for the customer and constantly put yourself in their shoes to create intuitive, frictionless experiences, we&#39;d love to hear from you!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_c6d97a39-7f0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8330487002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$300,000 - $375,000","x-skills-required":["Leadership","Software Development","AI","Data Platforms","Full-Stack Engineering"],"x-skills-preferred":["Snowflake","Hex"],"datePosted":"2026-04-18T15:54:31.577Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Leadership, Software Development, AI, Data Platforms, Full-Stack Engineering, Snowflake, Hex","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":300000,"maxValue":375000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_49bbf204-f34"},"title":"Product Operations Associate","description":"<p><strong>Job Title: Product Operations Associate</strong></p>\n<p><strong>Company: Brex</strong></p>\n<p>Join Brex, the intelligent finance platform that enables companies to spend smarter and move faster in over 200 markets.</p>\n<p>As a Product Operations Associate, you 
will serve as the operational partner within a technical product domain, developing deep subject matter expertise and owning the operational readiness of initiatives from early access through GA.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Act as the primary operational partner to Product, Engineering, and Design within your domain</li>\n<li>Embed launch enablement into planning and release processes</li>\n<li>Establish and maintain delivery discipline, dependency visibility, and risk tracking</li>\n<li>Define and track success metrics in partnership with Product and Data</li>\n<li>Drive structured communication and alignment across internal stakeholders</li>\n<li>Ensure documentation standards and release communications are clear and scalable</li>\n<li>Surface cross-functional impacts and escalate risks as needed</li>\n<li>Identify operational gaps and implement durable process improvements</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>2+ years of experience in program management, product operations, business operations, consulting, or a related field</li>\n<li>Experience participating in or managing cross-functional projects and process improvements</li>\n<li>Proven ability to develop and maintain strong cross-functional relationships across Product, Engineering, Design, and operational teams</li>\n<li>Proficiency in defining, interpreting, and communicating key metrics that drive business success</li>\n<li>Experience establishing and maintaining structured operating frameworks, including product launches throughout the lifecycle</li>\n<li>Ability to balance competing priorities and manage time effectively in a fast-paced environment</li>\n<li>Strong written and verbal communication skills, with the ability to translate complex or technical material to both internal and customer-facing audiences</li>\n<li>High ownership mindset, strong attention to detail, and ability to manage data with a high degree of 
accuracy</li>\n<li>Self-motivated with demonstrated critical thinking and problem-solving capabilities</li>\n<li>Familiarity with tools such as Snowflake, Looker, Jira, or similar data and workflow platforms</li>\n</ul>\n<p><strong>Bonus Points</strong></p>\n<ul>\n<li>Experience supporting product launches from early development through GA</li>\n<li>Background in fintech, payments, or other regulated industries</li>\n<li>Experience partnering closely with customer-facing teams such as GTM or Support</li>\n<li>Interest in improving operational tooling or documentation systems</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>The expected salary range for this role is $81,600 - $102,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_49bbf204-f34","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8438565002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$81,600 - $102,000","x-skills-required":["Program Management","Product Operations","Business Operations","Consulting","Cross-functional Projects","Process Improvements","Structured Operating Frameworks","Product Launches","Data Management","Communication","Critical Thinking","Problem-Solving","Snowflake","Looker","Jira"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:54:15.292Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Vancouver, British 
Columbia, Canada"}},"employmentType":"FULL_TIME","occupationalCategory":"Operations","industry":"Finance","skills":"Program Management, Product Operations, Business Operations, Consulting, Cross-functional Projects, Process Improvements, Structured Operating Frameworks, Product Launches, Data Management, Communication, Critical Thinking, Problem-Solving, Snowflake, Looker, Jira","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":81600,"maxValue":102000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ed072f2b-181"},"title":"Staff Accountant","description":"<p>We are seeking a Staff Accountant to join our Accounting team. As a Staff Accountant, you will be responsible for operational and corporate accounting responsibilities, including journal entries, accruals, reconciliations, and monthly close activities. Our ideal candidate will have experience and a desire to work in a fast-paced, dynamic environment.</p>\n<p>This is a hybrid (3x per week) opportunity out of our corporate office in downtown Austin, TX.</p>\n<p>As a member of our Accounting team, you will work directly with the Accounting Manager and Sr. 
Accountants during the month-end close cycle, leverage your technical software skills to optimize accounting processes and support financial reporting, ensure department &amp; vertical alignment across financial reporting systems and other organizational software systems, provide fluctuation analysis and insights to Accounting leadership for forecasting and financial reporting, and assist the team in developing and maintaining timely and accurate financial statements and reports in accordance with generally accepted accounting principles (GAAP).</p>\n<p>In addition, you will ensure financial records are in compliance with company policies and procedures, assist the ongoing effort to upskill the accounting team and modernize its processes and controls by introducing automation and technology tools, and meet with other company stakeholders to complete tasks as needed.</p>\n<p>To be successful in this role, you will need to have 2-6 years of hands-on accounting experience, a Bachelor&#39;s degree in Accounting, Business or Finance, basic operational knowledge of U.S. GAAP, strong technical skills with proficiency in accounting ERP software (e.g., NetSuite) and other financial reporting tools, and excellent analytical and problem-solving skills.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ed072f2b-181","directApply":true,"hiringOrganization":{"@type":"Organization","name":"RigUp","sameAs":"https://rigup.com","logo":"https://logos.yubhub.co/rigup.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/rigup/jobs/7533482003","x-work-arrangement":"hybrid","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["accounting","financial reporting","U.S. 
GAAP","NetSuite","ERP software","Excel","Google Docs","Microsoft Word","PowerPoint"],"x-skills-preferred":["RPA","automation software","Snowflake virtual data warehouse"],"datePosted":"2026-04-18T15:54:11.815Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Austin, Texas"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Energy","skills":"accounting, financial reporting, U.S. GAAP, NetSuite, ERP software, Excel, Google Docs, Microsoft Word, PowerPoint, RPA, automation software, Snowflake virtual data warehouse"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_58a44dab-91a"},"title":"Partner Solutions Architect - Japan","description":"<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across Japan. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>You will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud.</p>\n<p>Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing. This is not a purely reactive enablement role. 
The Partner SA is expected to help shape and execute repeatable partner plays that create revenue.</p>\n<p>That includes enabling partner sellers and architects, supporting account mapping and seller-to-seller engagement, helping define joint value propositions, supporting partner-led pipeline generation, and influencing product and field strategy based on what is learned in-market.</p>\n<p>Internal operating docs show this motion consistently includes enablement sessions, QBR sponsorships, account planning, workshops, field events, and targeted campaigns designed to produce sourced and influenced pipeline.</p>\n<p>You&#39;ll be part of a team helping dbt scale its ecosystem through better partner capability, tighter field alignment, and more repeatable pipeline generation. The role is especially important as dbt continues investing in structured partner motions and deeper engagement with major cloud and data platform partners.</p>\n<p>What you&#39;ll do:</p>\n<ul>\n<li>Partner closely with Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n</ul>\n<ul>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n</ul>\n<ul>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n</ul>\n<ul>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n</ul>\n<ul>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n</ul>\n<ul>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>\n</ul>\n<ul>\n<li>Equip partner teams with the technical messaging, 
demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n</ul>\n<ul>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n</ul>\n<ul>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n</ul>\n<ul>\n<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n</ul>\n<ul>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>\n</ul>\n<ul>\n<li>Travel approximately 30-40% to support partner planning, enablement, executive meetings, and field events across Japan</li>\n</ul>\n<p>This scope reflects how the Partner SA team is already operating: enabling partner field teams, building account-level alignment, supporting QBRs and regional events, and translating those activities into sourced and engaged pipeline.</p>\n<p>What you&#39;ll need:</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>\n</ul>\n<ul>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n</ul>\n<ul>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n</ul>\n<ul>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n</ul>\n<ul>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n</ul>\n<ul>\n<li>Strong 
presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n</ul>\n<ul>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n</ul>\n<ul>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>\n</ul>\n<ul>\n<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>\n</ul>\n<ul>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out:</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n</ul>\n<ul>\n<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n</ul>\n<ul>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n</ul>\n<ul>\n<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>\n</ul>\n<ul>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n</ul>\n<ul>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n</ul>\n<ul>\n<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>\n</ul>\n<ul>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>What to expect in the interview process (all video interviews unless accommodations are needed):</p>\n<ul>\n<li>Interview with Talent Acquisition Partner</li>\n</ul>\n<ul>\n<li>Interview with Hiring Manager</li>\n</ul>\n<ul>\n<li>Team 
Interviews</li>\n</ul>\n<ul>\n<li>Demo Round</li>\n</ul>\n<p>#LI-LA1</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_58a44dab-91a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673657005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data platforms","Snowflake","Databricks","Google Cloud","partner engineering","customer-facing technical role"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:53:29.744Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Japan - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner engineering, customer-facing technical role, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0b97b97d-56b"},"title":"Solutions Engineer (pre-sales)","description":"<p>About Mixpanel</p>\n<p>Mixpanel turns data clarity into innovation. 
Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel&#39;s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>\n<p>As a Solutions Engineer (pre-sales) at Mixpanel, you will partner closely with Account Executives, Account Managers, Product, Engineering, and Support to successfully roll out self-serve analytics within our customers&#39; organizations, help them manage change, and execute technical projects and services that delight our customers and ultimately drive ROI on their Mixpanel investment.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Support Sales Engineers and Account Executives in deal cycles, contributing to technical discovery and solution design</li>\n<li>Deliver standard and semi-customized product demos to prospects</li>\n<li>Assist in qualifying customer use cases and identifying opportunities where Mixpanel can provide value</li>\n<li>Contribute to proof-of-concept projects, including setup, execution, and documentation of results</li>\n<li>Provide guidance on implementation best practices, including instrumentation and data structure</li>\n<li>Collaborate with internal teams (Sales, Product, Engineering, Support) to ensure a smooth customer experience</li>\n<li>Build relationships with customer stakeholders and respond to technical questions</li>\n<li>Capture and share customer feedback with internal teams to inform product improvements</li>\n</ul>\n<p>We&#39;re Looking For Someone Who Has</p>\n<ul>\n<li>Ability to communicate with both technical and non-technical stakeholders</li>\n<li>Some experience supporting technical sales cycles, customer implementations, or consulting engagements</li>\n<li>3+ years of experience in Sales Engineering, Customer Success, Solutions Consulting, or a related role</li>\n<li>Working knowledge of data concepts such as SQL, event tracking, or analytics tools</li>\n<li>Familiarity with databases or 
cloud data warehouses (e.g., Snowflake, BigQuery, Redshift)</li>\n<li>Strong problem-solving skills with the ability to work on moderately complex, well-defined problems</li>\n<li>Solid communication and presentation skills</li>\n<li>Ability to manage multiple workstreams with guidance</li>\n<li>Interest in learning and applying new technologies, including AI tools</li>\n<li>Willingness to travel as needed</li>\n</ul>\n<p>Compensation</p>\n<p>The amount listed below is the total target cash compensation (TTCC) and includes base compensation and variable compensation in the form of either a company bonus or commissions. Variable compensation type is determined by your role and level. In addition to the cash compensation provided, this position is also eligible for equity consideration and other benefits including medical, vision, and dental insurance coverage.</p>\n<p>Our salary ranges are determined by role and level and are benchmarked to the SF Bay Area Technology data cut released by Radford, a global compensation database. The range displayed represents the minimum and maximum TTCC for new hire salaries for the position across all of our US locations. To stay on top of market conditions, we refresh our salary ranges twice a year so these ranges may change in the future. 
Within the range, individual pay is determined by experience, job-related skills, qualifications, and other factors.</p>\n<p>Mixpanel Compensation Range: $170,000-$230,000 USD</p>\n<p>Benefits and Perks</p>\n<ul>\n<li>Comprehensive Medical, Vision, and Dental Care</li>\n<li>Mental Wellness Benefit</li>\n<li>Generous Vacation Policy &amp; Additional Company Holidays</li>\n<li>Enhanced Parental Leave</li>\n<li>Volunteer Time Off</li>\n<li>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</li>\n</ul>\n<p>Culture Values</p>\n<ul>\n<li>Make Bold Bets: We choose courageous action over comfortable progress.</li>\n<li>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</li>\n<li>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</li>\n<li>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</li>\n<li>Champion the Customer: We seek to deeply understand our customers&#39; needs, ensuring their success is our north star.</li>\n<li>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</li>\n</ul>\n<p>Why choose Mixpanel?</p>\n<p>We&#39;re a leader in analytics with over 9,000 customers and $277M raised from prominent investors, including Andreessen Horowitz, Sequoia, YC, and, most recently, Bain Capital. Mixpanel&#39;s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics. 
Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>\n<p>Choosing to work at Mixpanel means you&#39;ll be helping the world&#39;s most innovative companies learn from their data so they can make better decisions.</p>\n<p>Mixpanel is an equal opportunity employer supporting workforce diversity. At Mixpanel, we are focused on the things that really matter: our people, our customers, and our partners, out of a recognition that those relationships are the most valuable assets we have. We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply. We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, veteran status, or disability status. Pursuant to the San Francisco Fair Chance Ordinance or other similar laws that may be applicable, we will consider for employment qualified applicants with arrest and conviction records.</p>\n<p>We&#39;ve immersed ourselves in our Culture and Values as our guiding principles for the impact we want to have and the future</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0b97b97d-56b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mixpanel","sameAs":"https://mixpanel.com","logo":"https://logos.yubhub.co/mixpanel.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mixpanel/jobs/7800289","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$170,000-$230,000 USD","x-skills-required":["SQL","event tracking","analytics tools","databases","cloud data 
warehouses","Snowflake","BigQuery","Redshift"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:53:12.142Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, US (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, event tracking, analytics tools, databases, cloud data warehouses, Snowflake, BigQuery, Redshift","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":170000,"maxValue":230000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d301c7b8-b54"},"title":"Manager, Solutions Engineering","description":"<p>About Mixpanel</p>\n<p>Mixpanel turns data clarity into innovation. Its AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence.</p>\n<p>As a Manager on the Solutions Engineering team at Mixpanel, you will lead a talented group of analytics consultants who are pivotal to our success. You will be at the forefront of driving customer value, guiding your team as they serve as the primary technical resources for our Sales organisation.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Develop &amp; Mentor: Lead, coach, and grow a high-performing and inclusive team of Solutions Engineers, actively investing in their career development and upholding a high standard of performance.</li>\n</ul>\n<ul>\n<li>Drive Results: Partner closely with Sales leadership and Account Executives to provide technical expertise that drives new, retained, and expansion ARR. You will ensure your team&#39;s activities are directly contributing to the company&#39;s bottom line.</li>\n</ul>\n<ul>\n<li>Prioritise &amp; Problem Solve: Guide your team through complex customer evaluations and technical challenges. 
You will manage team resources effectively, aligning the right skills to customer needs to achieve productivity targets and successful outcomes.</li>\n</ul>\n<ul>\n<li>Cross-Functional Partnership: Act as a key technical liaison, collaborating with peer managers across Sales, Product, and Engineering. You will gather and synthesise customer feedback from your team to influence product strategy and solve problems at scale.</li>\n</ul>\n<ul>\n<li>Communicate &amp; Manage Change: Effectively translate broader company and departmental strategy into clear, actionable goals for your team. You will guide your direct reports through evolving business priorities with empathy and clarity.</li>\n</ul>\n<ul>\n<li>Hire the Best: Actively assess the needs of the team, build a pipeline of top talent, and hire outstanding individuals who elevate the team&#39;s capabilities and contribute to our inclusive culture.</li>\n</ul>\n<ul>\n<li>Innovate &amp; Raise the Bar: Relentlessly seek to improve how your team operates, from refining demo strategies and proof-of-concept methodologies to adopting new tools and processes that increase effectiveness and celebrate success.</li>\n</ul>\n<p>We&#39;re Looking For Someone Who</p>\n<ul>\n<li>Has progressive experience in a B2B SaaS environment, including 3+ years of people management experience leading a technical pre-sales, solutions engineering, or professional services team.</li>\n</ul>\n<ul>\n<li>Exhibits a &#39;player-coach&#39; mentality with deep knowledge in the data and analytics space. 
You are an expert on how data products (like CDPs, data warehouses, and analytics tools) are implemented and adopted by customers.</li>\n</ul>\n<ul>\n<li>Is a proven cross-functional partner with a track record of successfully working with sales teams to navigate complex deals and drive revenue.</li>\n</ul>\n<ul>\n<li>Demonstrates expertise in communicating complex technical concepts clearly and effectively to both technical and non-technical stakeholders.</li>\n</ul>\n<ul>\n<li>Is skilled at prioritising team activities and managing workload in a dynamic environment, balancing customer needs with efficiency goals.</li>\n</ul>\n<ul>\n<li>Is a natural mentor and developer of talent, with a passion for coaching and a history of building inclusive, high-achieving teams.</li>\n</ul>\n<ul>\n<li>Handles ambiguity with ease, demonstrating flexibility and a proactive, problem-solving mindset when adapting to new challenges and business priorities.</li>\n</ul>\n<ul>\n<li>Actively seeks feedback and remains humble and eager to learn, consistently looking for ways to improve themselves and their team.</li>\n</ul>\n<p>Bonus Points</p>\n<ul>\n<li>Previous experience in management consulting, strategic operations, or a similar role focused on go-to-market strategy.</li>\n</ul>\n<ul>\n<li>Direct, hands-on experience with Mixpanel or other product analytics tools like Amplitude, Pendo, or Contentsquare.</li>\n</ul>\n<ul>\n<li>Strong familiarity with the modern data stack, including tools like Snowflake, Google BigQuery, Segment, or Hightouch.</li>\n</ul>\n<p>Compensation</p>\n<p>The amount listed below is the total target cash compensation (TTCC) and includes base compensation and variable compensation in the form of either a company bonus or commissions. Variable compensation type is determined by your role and level. 
In addition to the cash compensation provided, this position is also eligible for equity consideration and other benefits including medical, vision, and dental insurance coverage.</p>\n<p>Our salary ranges are determined by role and level and are benchmarked to the SF Bay Area Technology data cut released by Radford, a global compensation database. The range displayed represents the minimum and maximum TTCC for new hire salaries for the position across all of our US locations. To stay on top of market conditions, we refresh our salary ranges twice a year so these ranges may change in the future. Within the range, individual pay is determined by experience, job-related skills, qualifications, and other factors.</p>\n<p>If you have questions about the specific range, your recruiter can share this information.</p>\n<p>Mixpanel Compensation Range: $238,300-$321,705 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d301c7b8-b54","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mixpanel","sameAs":"https://mixpanel.com","logo":"https://logos.yubhub.co/mixpanel.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mixpanel/jobs/7746430","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$238,300-$321,705 USD","x-skills-required":["product analytics","data and analytics","data products","CDPs","data warehouses","analytics tools","Mixpanel","Amplitude","Pendo","Contentsquare","Snowflake","Google BigQuery","Segment","Hightouch"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:51:13.741Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City, US (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"product analytics, data and analytics, data products, CDPs, data 
warehouses, analytics tools, Mixpanel, Amplitude, Pendo, Contentsquare, Snowflake, Google BigQuery, Segment, Hightouch","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":238300,"maxValue":321705,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_09a4d1ce-cde"},"title":"Data Engineer","description":"<p>We are looking for an experienced Data Engineer to partner with our Data Science and Data Infrastructure teams to own and scale our data pipelines. You&#39;ll also work closely with stakeholders across business teams including sales, marketing, and finance to ensure that the data they need arrives promptly and reliably.</p>\n<p>As a Data Engineer at Figma, you will be responsible for building and maintaining scalable data pipelines that connect various cloud data sources. You will develop a deep understanding of Figma&#39;s core data models and optimize data pipelines for scale. You will partner with the Data Science and Data Infrastructure teams to build new foundational data sets that are trusted, well understood, and enable self-service.</p>\n<p>You will work with a wide range of cross-functional stakeholders to derive requirements and architect shared datasets, with the ability to document, simplify, and explain complex problems to different types of audiences. 
You will establish best practices for the development of specialized data sets for analytics and modeling.</p>\n<p>We&#39;d love to hear from you if you have:</p>\n<ul>\n<li>4+ years in a relevant field.</li>\n<li>Fluency with both SQL and Python.</li>\n<li>Familiarity with Snowflake, dbt, Dagster, and ETL/reverse ETL tools.</li>\n<li>Excellent judgment and creative problem-solving skills.</li>\n<li>A self-starting mindset along with strong communication and collaboration skills.</li>\n</ul>\n<p>While not required, it&#39;s an added plus if you also have:</p>\n<ul>\n<li>Knowledge of data modeling methodologies to design and build robust data architectures for insightful analytics.</li>\n<li>Experience with business systems such as Salesforce, Customer IO, Stripe, or NetSuite.</li>\n</ul>\n<p>At Figma, one of our values is Grow as you go. We believe in hiring smart, curious people who are excited to learn and develop their skills. If you&#39;re excited about this role but your past experience doesn&#39;t align perfectly with the points outlined in the job description, we encourage you to apply anyway. 
You may be just the right candidate for this or other roles.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_09a4d1ce-cde","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Figma","sameAs":"https://www.figma.com/","logo":"https://logos.yubhub.co/figma.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/figma/jobs/5220003004","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$140,000-$348,000 USD","x-skills-required":["SQL","Python","Snowflake","dbt","Dagster","ETL/reverse ETL tools"],"x-skills-preferred":["data modeling methodologies","business systems such as Salesforce, Customer IO, Stripe, NetSuite"],"datePosted":"2026-04-18T15:51:04.727Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA • New York, NY • United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, Snowflake, dbt, Dagster, ETL/reverse ETL tools, data modeling methodologies, business systems such as Salesforce, Customer IO, Stripe, NetSuite","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":140000,"maxValue":348000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_328a534b-bac"},"title":"Customer Sales Director (Austin, TX)","description":"<p>We are looking for a Customer Sales Director to focus on an at-scale strategy to support, retain, and grow a mix of our Commercial and Enterprise customer base. This role is a hybrid-based role in Austin, Texas.</p>\n<p>The ideal candidate will have 4+ years of experience in SaaS sales or account management, with a proven track record of exceeding targets. 
They will be able to build a strategic plan to drive expansion in a portfolio of Commercial and Enterprise accounts, manage multiple sales cycles and customer campaigns targeting Analytics Engineering, Data Platform, and Data Governance personas.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Building a strategic plan to drive expansion in a portfolio of Commercial and Enterprise accounts</li>\n<li>Managing multiple sales cycles and customer campaigns targeting Analytics Engineering, Data Platform, and Data Governance personas</li>\n<li>Protecting renewals by monitoring account signals, deepening executive alignment, and helping customers realize consistent value</li>\n</ul>\n<p>The successful candidate will have strong consultative selling skills, engaging effectively with both technical and business audiences. They will be proactive and organized, capable of independently managing a diverse book of business.</p>\n<p>Preferred qualifications include prior experience in analytics, ETL, BI, or open-source software, familiarity with dbt (core or Cloud) and the modern data stack, including platforms like Snowflake, BigQuery, Redshift, or Databricks, experience with consumption and/or usage-based pricing structures, and experience with the MEDD(P)ICC sales methodology / Command of the Message.</p>\n<p>Benefits include unlimited vacation time, 401k plan with 3% guaranteed company contribution, comprehensive healthcare coverage, generous paid parental leave, flexible stipends for health &amp; wellness, home office setup, cell phone &amp; internet, learning &amp; development, and office space.</p>\n<p>We offer competitive compensation packages commensurate with experience, including salary, equity, and where applicable, performance-based pay.</p>","url":"https://yubhub.co/jobs/job_328a534b-bac","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4616931005","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SaaS sales","account management","analytics","ETL","BI","open-source software","dbt","Snowflake","BigQuery","Redshift","Databricks","consumption and/or usage-based pricing structures","MEDD(P)ICC sales methodology / Command of the Message"],"x-skills-preferred":["prior experience in analytics","familiarity with dbt (core or Cloud)","experience with consumption and/or usage-based pricing structures"],"datePosted":"2026-04-18T15:51:03.617Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Austin, Texas"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"SaaS sales, account management, analytics, ETL, BI, open-source software, dbt, Snowflake, BigQuery, Redshift, Databricks, consumption and/or usage-based pricing structures, MEDD(P)ICC sales methodology / Command of the Message, prior experience in analytics, familiarity with dbt (core or Cloud), experience with consumption and/or usage-based pricing structures"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_bb468a26-d1d"},"title":"Accounting Manager, GL Operations & Intercompany","description":"<p>Ready to be pushed beyond what you think you’re capable of?</p>\n<p>At Coinbase, our mission is to increase economic freedom in the world.</p>\n<p>We are seeking a highly motivated accounting professional to join the Controllership team and play a key role in our continued growth and achievement of operational 
excellence within the Global GL Operations team.</p>\n<p>As Coinbase continues to expand its global footprint, the Intercompany and GL Operations team plays a critical role in ensuring the integrity of our global consolidation and the scalability of our financial infrastructure.</p>\n<p>Reporting to the Assistant Controller, the Accounting Manager, Intercompany and GL Operations will play a pivotal role in overseeing Coinbase’s global intercompany framework and general ledger functions.</p>\n<p>This position will manage the month-end close process for intercompany settlements, transfer pricing execution, and global consolidations.</p>\n<p>The ideal candidate will have the opportunity to work with leaders across Tax, Treasury, and Finance Data to operationalize new entity launches and support the growth of a dynamic, high-growth technology company.</p>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Lead Global Close Operations: Oversee the team performing monthly close activities for global intercompany transactions, ensuring timely and accurate eliminations and consolidated reporting.</li>\n</ul>\n<ul>\n<li>Cross-Functional Partnership: Partner with Tax, Treasury, and Legal to understand new intercompany agreements and transfer pricing methodologies; design and implement scalable operational processes to support these models.</li>\n</ul>\n<ul>\n<li>Process Ownership: Manage all global processes for intercompany billing, cash settlements, and foreign exchange (FX) gain/loss analysis, ensuring high levels of precision and operational excellence.</li>\n</ul>\n<ul>\n<li>Control Environment: Lead the design, implementation, and documentation of SOX-compliant internal controls over intercompany and GL processes to support Coinbase’s evolving regulatory requirements.</li>\n</ul>\n<ul>\n<li>Data Strategy: Collaborate with Finance Data and Engineering to investigate and resolve platform-related intercompany data issues and automate manual reconciliation 
workflows.</li>\n</ul>\n<ul>\n<li>Audit &amp; Reporting: Lead the preparation of quarterly audit schedules and provide expert support for internal and external audits related to global entity management and consolidations.</li>\n</ul>\n<ul>\n<li>Strategic Projects: Support and participate in GL-related projects, including M&amp;A integration, new entity set-ups, and ERP enhancements (NetSuite).</li>\n</ul>\n<ul>\n<li>Team Leadership: Mentor and develop staff (L4/L5), fostering a culture of continuous improvement and professional growth.</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>CPA required</li>\n</ul>\n<ul>\n<li>6+ years of experience in Finance/Accounting, with a minimum of 2 years in a leadership role</li>\n</ul>\n<ul>\n<li>Bachelor’s degree in Accounting, Finance or related field</li>\n</ul>\n<ul>\n<li>Knowledge of U.S. GAAP and Internal Controls</li>\n</ul>\n<ul>\n<li>Experience with implementing SOX 404 as well as on-going documentation</li>\n</ul>\n<ul>\n<li>A demonstrated history of solving multiple complex operational and accounting challenges</li>\n</ul>\n<ul>\n<li>Project management and systems implementation experience</li>\n</ul>\n<ul>\n<li>Adaptability and the ability to work in a dynamic environment</li>\n</ul>\n<ul>\n<li>Ability to work collaboratively across departmental functions</li>\n</ul>\n<ul>\n<li>Demonstrates the ability to responsibly use generative AI tools and copilots (e.g., LibreChat, Gemini, Glean) in daily workflows, continuously learn as tools evolve, and apply human‑in‑the‑loop practices to deliver business‑ready outputs and drive measurable improvements in efficiency, cost, and quality.</li>\n</ul>\n<p><strong>Nice to Have:</strong></p>\n<ul>\n<li>Experience in the FinTech or Crypto industry.</li>\n</ul>\n<ul>\n<li>Advanced knowledge of SQL and experience working with Snowflake, Looker, or similar big-data analytics tools.</li>\n</ul>\n<ul>\n<li>Experience with M&amp;A integration and international entity 
expansion.</li>\n</ul>\n<ul>\n<li>Public accounting experience (Big 4 preferred).</li>\n</ul>\n<ul>\n<li>Experience with Netsuite and G-Suite Workspace Collaboration tools.</li>\n</ul>\n<p><strong>Pay Transparency Notice:</strong></p>\n<p>Depending on your work location, the target annual base salary for this position can range as detailed below. Total compensation may also include equity and bonus eligibility and benefits (including medical, dental, vision and 401(k)).</p>\n<p>Annual base salary range (excluding equity and bonus):</p>\n<p>$166,345-$195,700 USD</p>","url":"https://yubhub.co/jobs/job_bb468a26-d1d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Coinbase","sameAs":"https://www.coinbase.com/","logo":"https://logos.yubhub.co/coinbase.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coinbase/jobs/7779094","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$166,345-$195,700 USD","x-skills-required":["CPA","U.S. GAAP","Internal Controls","SOX 404","Project Management","Systems Implementation"],"x-skills-preferred":["SQL","Snowflake","Looker","M&A Integration","International Entity Expansion","Public Accounting","Netsuite","G-Suite Workspace Collaboration"],"datePosted":"2026-04-18T15:50:53.737Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - USA"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"CPA, U.S. 
GAAP, Internal Controls, SOX 404, Project Management, Systems Implementation, SQL, Snowflake, Looker, M&A Integration, International Entity Expansion, Public Accounting, Netsuite, G-Suite Workspace Collaboration","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":166345,"maxValue":195700,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d2b1604a-c20"},"title":"Applied AI Engineer","description":"<p>We are seeking an Applied AI Engineer to join our team at Komodo Health. As an Applied AI Engineer, you will design and deploy end-to-end AI solutions that power real products and internal tools. You&#39;ll work at the intersection of applied research, engineering, and product development, bringing modern AI techniques into scalable production systems.</p>\n<p>You will collaborate closely with product, platform, and data teams to build AI capabilities that transform how healthcare data is explored, understood, and operationalized. Your work will involve designing, building, and deploying agent-based AI pipelines integrated into real customer-facing products, as well as building internal AI productivity tools that accelerate engineering workflows across Komodo.</p>\n<p>In this role, you will have the opportunity to work on a wide range of projects, from developing AI-powered applications to integrating AI capabilities across backend services and product interfaces. You will also contribute reusable patterns to Komodo&#39;s AI infrastructure and internal tooling ecosystem.</p>\n<p>To be successful in this role, you will need to have experience building production-grade AI systems or AI-powered applications, strong proficiency in Python, and experience working with LLMs, prompt engineering, or agent-based architectures. 
You will also need to be able to integrate AI capabilities across backend services and product interfaces, and have experience designing evaluation frameworks, testing strategies, or monitoring systems for AI features.</p>\n<p>If you are passionate about using AI to drive innovation and improvement in healthcare, and have the skills and experience to succeed in this role, we encourage you to apply.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Design and deploy end-to-end AI solutions that power real products and internal tools</li>\n<li>Collaborate closely with product, platform, and data teams to build AI capabilities that transform how healthcare data is explored, understood, and operationalized</li>\n<li>Develop agent-based AI pipelines integrated into real customer-facing products</li>\n<li>Build internal AI productivity tools that accelerate engineering workflows across Komodo</li>\n<li>Contribute reusable patterns to Komodo&#39;s AI infrastructure and internal tooling ecosystem</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Experience building production-grade AI systems or AI-powered applications</li>\n<li>Strong proficiency in Python</li>\n<li>Experience working with LLMs, prompt engineering, or agent-based architectures</li>\n<li>Ability to integrate AI capabilities across backend services and product interfaces</li>\n<li>Experience designing evaluation frameworks, testing strategies, or monitoring systems for AI features</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Healthcare data expertise</li>\n<li>Experience with distributed computing frameworks (e.g., Spark, Snowflake, Databricks) for large-scale data processing</li>\n</ul>\n<p><strong>Location</strong></p>\n<p>This role is located in San Francisco, California, and is available for remote work.</p>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Comprehensive health, dental, and vision insurance</li>\n<li>Flexible time off and holidays</li>\n<li>401(k) with 
company match</li>\n<li>Disability insurance and life insurance</li>\n<li>Leaves of absence in accordance with applicable state and local laws and regulations and company policy</li>\n</ul>\n<p><strong>Equal Opportunity Employer</strong></p>\n<p>Komodo Health is an equal opportunity employer and welcomes applications from all qualified candidates. We are committed to diversity and inclusion in the workplace and strive to create a work environment that is free from discrimination and harassment.</p>","url":"https://yubhub.co/jobs/job_d2b1604a-c20","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Komodo Health","sameAs":"https://www.komodohealth.com/","logo":"https://logos.yubhub.co/komodohealth.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/komodohealth/jobs/8512178002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$191,000 - $224,000 per year","x-skills-required":["Python","LLMs","Prompt Engineering","Agent-Based Architectures","Distributed Computing Frameworks","Spark","Snowflake","Databricks"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:50:46.591Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"Python, LLMs, Prompt Engineering, Agent-Based Architectures, Distributed Computing Frameworks, Spark, Snowflake, Databricks","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":224000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ae53b8d4-8fd"},"title":"Sr. 
AI Engineer, Application Engineering","description":"<p>We&#39;re hiring a Sr. AI Engineer to join our IT department to manage AI agentic deployments and deliver real impact for our customers. As a Sr. AI Engineer, you will design and develop tailored solutions using the Elastic Agent Builder platform and related technologies, guide technical engagements, and support the growth of junior engineers.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Technical Delivery and Implementation: Own the end-to-end technical delivery of AI solutions, including writing code, configuring systems, and resolving issues, while reviewing the work of junior team members to ensure quality deployment and measurable business impact.</li>\n</ul>\n<ul>\n<li>AI Solution Development: Take ownership of designing and implementing scalable production systems, including AI and Large Language Model (LLM) based intelligent agents and automated workflows built on the Salesforce platform.</li>\n</ul>\n<ul>\n<li>Custom Agentic AI Engineering: Work directly with stakeholders to design and build custom intelligent agents using the Elastic Agent Builder platform, ensuring solutions meet unique business requirements and integrate smoothly with existing tool ecosystems.</li>\n</ul>\n<ul>\n<li>Data Configuration and Integration: Own the full data lifecycle, from data model design to building efficient processing pipelines and establishing integration strategies. Ensure data is optimized and secure for AI applications, including in complex enterprise environments.</li>\n</ul>\n<ul>\n<li>Technical Problem Solving: Identify, analyze, and resolve technical challenges across all phases of solution delivery, from data integration to model deployment and agent orchestration. Serve as a reliable resource for unblocking progress.</li>\n</ul>\n<ul>\n<li>Agentic Innovation: Develop expertise in the Elastic platform, pushing its capabilities forward. 
Lead the development of custom intelligent agents, automate business processes, and shape user experiences. Insights from the field will directly influence product enhancements and platform direction.</li>\n</ul>\n<ul>\n<li>Client Partnership: Embed with client teams to understand their operational challenges and goals. Translate requirements into clear technical designs, build strong relationships, and serve as a trusted technical advisor.</li>\n</ul>\n<ul>\n<li>Debugging and Root Cause Analysis: Perform thorough analysis, debugging, and root cause identification for complex system interactions, data flows, and AI model behaviors to optimize performance and prevent recurring issues.</li>\n</ul>\n<ul>\n<li>Prototyping and Iteration: Rapidly develop proofs-of-concept and minimum viable products, often coding alongside client teams to demonstrate capabilities and gather feedback for iterative refinement.</li>\n</ul>\n<ul>\n<li>Engineering Best Practices: Apply and promote standards for code quality, scalability, security, and maintainability across all deployed solutions.</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>At least 5 years&#39; experience in a hands-on, end-to-end delivery role for scalable production solutions in a professional environment</li>\n</ul>\n<ul>\n<li>Expert-level proficiency in one or more programming languages (e.g., JavaScript, Java, Python)</li>\n</ul>\n<ul>\n<li>Extensive experience building and deploying solutions with AI/LLM technologies, including integrating LLMs, applying AI orchestration frameworks (e.g., LangChain, LlamaIndex), prompt engineering techniques, and agentic frameworks</li>\n</ul>\n<ul>\n<li>Deep expertise in data modeling, processing, integration, and analytics, with proficiency in enterprise data platforms (e.g., Salesforce Data Cloud, Snowflake, Databricks, BigQuery)</li>\n</ul>\n<ul>\n<li>Strong collaboration, communication, and presentation skills, both written and verbal, with the ability to explain complex 
technical concepts to technical and non-technical partners</li>\n</ul>\n<ul>\n<li>Track record of leading technical engagements, mentoring junior team members, and taking responsibility for technical aspects of projects</li>\n</ul>\n<p>This role is eligible to participate in Elastic&#39;s stock program and has a competitive salary range of $94,300-$149,200 USD, with an alternate range of $113,300-$179,200 USD in select locations.</p>","url":"https://yubhub.co/jobs/job_ae53b8d4-8fd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Elastic","sameAs":"https://www.elastic.co/","logo":"https://logos.yubhub.co/elastic.co.png"},"x-apply-url":"https://job-boards.greenhouse.io/elastic/jobs/7722032","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$94,300-$149,200 USD","x-skills-required":["JavaScript","Java","Python","AI/LLM technologies","LangChain","LlamaIndex","prompt engineering techniques","agentic frameworks","data modeling","processing","integration","analytics","Salesforce Data Cloud","Snowflake","Databricks","BigQuery"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:49:59.873Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"JavaScript, Java, Python, AI/LLM technologies, LangChain, LlamaIndex, prompt engineering techniques, agentic frameworks, data modeling, processing, integration, analytics, Salesforce Data Cloud, Snowflake, Databricks, 
BigQuery","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":94300,"maxValue":149200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e40d534f-76a"},"title":"Resident Architect","description":"<p>About Us</p>\n<p>dbt Labs is the pioneer of analytics engineering, helping data teams transform raw data into reliable, actionable insights. As of February 2025, we&#39;ve surpassed $100 million in annual recurring revenue (ARR) and serve more than 5,400 dbt Platform customers.</p>\n<p>We&#39;re seeking an experienced Resident Architect (RA) with a passion for solving challenging problems with dbt to join our Professional Services team. RAs are billable to dbt Enterprise customers and help achieve our mission to empower data developers to create and disseminate organisational knowledge.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects - inclusive of implementation, troubleshooting configurations, instilling best practices, and solutioning MVPs and long-term solutions to customer-specific requirements</li>\n</ul>\n<ul>\n<li>Consult on architecture and design</li>\n</ul>\n<ul>\n<li>Ensure our most strategic enterprise customers are adopting the product</li>\n</ul>\n<ul>\n<li>Collaborate with other internal customer-facing teams at dbt Labs - Sales, Solution Architects, Training, Support</li>\n</ul>\n<ul>\n<li>Provide critical feedback to dbt Labs product and engineering teams to improve and prioritise customer requests and ensure rapid resolution for engagement-specific issues</li>\n</ul>\n<ul>\n<li>Become a product expert with dbt in the context of the modern data stack (if you aren&#39;t already)</li>\n</ul>\n<p>What You&#39;ll Need</p>\n<ul>\n<li>4+ years&#39; experience working with technical data tooling, even better if it is in a customer-facing post-sales, technical architect 
or consulting role</li>\n</ul>\n<ul>\n<li>Deep expertise in at least one data platform (Snowflake, Databricks, BigQuery, Redshift)</li>\n</ul>\n<ul>\n<li>Experience using, deploying, or configuring dbt in an enterprise setting - working with dbt for a minimum of 1 year</li>\n</ul>\n<ul>\n<li>Proficiency in writing SQL and Python in analytics contexts</li>\n</ul>\n<ul>\n<li>You look forward to building skills in technical areas that support deployment and integration of dbt enterprise solutions to complete customer projects</li>\n</ul>\n<ul>\n<li>Customer focus, embracing one of our core values that users are our best advocates</li>\n</ul>\n<ul>\n<li>Strong organisational skills with the ability to manage multiple technical projects simultaneously - including defining scope, tracking timelines, and ensuring deliverables are met</li>\n</ul>\n<ul>\n<li>Clear and concise communicator with the ability to engage internal and external stakeholders, effectively explain complex technical or organisational challenges, and propose thoughtful, iterative solutions</li>\n</ul>\n<ul>\n<li>The ability to thrive in a remote organisation that highly values transparency and cross-collaboration</li>\n</ul>\n<ul>\n<li>Travel approximately 2-4x/year for customer onsite sessions, team offsites, and company events will be expected</li>\n</ul>\n<p>What Will Make You Stand Out</p>\n<ul>\n<li>You have obtained the dbt Analytics Engineering Certification</li>\n</ul>\n<ul>\n<li>You have the ability to advise on dbt enterprise recommendations, and build direction/consensus with the customer to move forward</li>\n</ul>\n<ul>\n<li>Experience with traditional Enterprise ETL tooling (Informatica, Datastage, Talend)</li>\n</ul>\n<p>Remote Hiring Process</p>\n<ul>\n<li>Interview with a Talent Acquisition Partner</li>\n</ul>\n<ul>\n<li>Hiring Manager Interview</li>\n</ul>\n<ul>\n<li>Technical Task + Presentation</li>\n</ul>\n<ul>\n<li>Team Interview</li>\n</ul>\n<p>Benefits</p>\n<ul>\n<li>Unlimited vacation 
time with a culture that actively encourages time off</li>\n</ul>\n<ul>\n<li>401k plan with 3% guaranteed company contribution</li>\n</ul>\n<ul>\n<li>Comprehensive healthcare coverage</li>\n</ul>\n<ul>\n<li>Generous paid parental leave</li>\n</ul>\n<ul>\n<li>Flexible stipends for:</li>\n</ul>\n<ul>\n<li>Health &amp; Wellness</li>\n</ul>\n<ul>\n<li>Home Office Setup</li>\n</ul>\n<ul>\n<li>Cell Phone &amp; Internet</li>\n</ul>\n<ul>\n<li>Learning &amp; Development</li>\n</ul>\n<ul>\n<li>Office Space</li>\n</ul>\n<p>Compensation</p>\n<p>We offer competitive compensation packages commensurate with experience, including salary, equity, and where applicable, performance-based pay. Our Talent Acquisition Team can answer questions around dbt Labs&#39; total rewards during your interview process.</p>\n<p>In select locations (including Boston, Chicago, Denver, Los Angeles, Philadelphia, New York City, San Francisco, Washington, DC, and Seattle), an alternate range may apply, as specified below.</p>\n<ul>\n<li>The typical starting salary range for this role is:</li>\n</ul>\n<p>$114,000 - $137,700</p>\n<ul>\n<li>The typical starting salary range for this role in the select locations listed is:</li>\n</ul>\n<p>$126,000 - $153,000</p>","url":"https://yubhub.co/jobs/job_e40d534f-76a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4627942005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$114,000 - $137,700","x-skills-required":["dbt","data platform","Snowflake","Databricks","BigQuery","Redshift","SQL","Python","analytics 
engineering"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:49:56.862Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"dbt, data platform, Snowflake, Databricks, BigQuery, Redshift, SQL, Python, analytics engineering","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":114000,"maxValue":137700,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_760c3e88-e35"},"title":"Senior Product Manager, Data","description":"<p>Job Title: Senior Product Manager, Data</p>\n<p>We are seeking a Senior Product Manager to support the development of CoreWeave&#39;s Enterprise Data Platform within the CIO organization. This role will contribute to building a scalable, high-performance data lake and data architecture, integrating data from key sources across Operations, Engineering, Sales, Finance, and other IT partners.</p>\n<p>As a Senior Product Manager for Data Infrastructure and Analytics, you will help drive data ingestion, transformation, governance, and analytics enablement. 
You will collaborate with engineering, analytics, finance, and business teams to help deliver data lake and pipeline orchestration solutions, ensuring accessible data for business insights.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Own and evangelize Data Platform and Business Analytics roadmap and strategy across CoreWeave</li>\n<li>Assist with the execution of CoreWeave&#39;s enterprise data architecture, helping enable the data lake and domain-driven data layer</li>\n<li>Support the development and enhancement of data ingestion, transformation, and orchestration pipelines for scalability, efficiency, and reliability</li>\n<li>Work with the Engineering and Data teams to maintain and enhance data pipelines for both structured and unstructured data, enabling efficient data movement across the organization</li>\n<li>Collaborate with Finance, GTM, Infrastructure, Data Center, and Supply Chain teams to help unify and model data from core systems (ERP, CRM, Asset Mgmt, Supply Chain systems, etc.)</li>\n<li>Contribute to data governance and quality initiatives, focusing on data consistency, lineage tracking, and compliance with security standards</li>\n<li>Support the BI and analytics layer by partnering with stakeholders to enable data products, dashboards, and reporting capabilities</li>\n<li>Help prioritize data-driven initiatives, ensuring alignment with business goals and operational needs in coordination with leadership</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5+ years of experience in data product management, data architecture, or enterprise data engineering roles</li>\n<li>Familiarity with data lakes, data warehouses, ETL/ELT and streaming pipelines, and data governance frameworks</li>\n<li>Hands-on experience with modern data stack technologies (such as Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka)</li>\n<li>Understanding of data modeling, domain-driven design, and creating scalable data platforms</li>\n<li>Experience supporting the 
end-to-end data product lifecycle, including requirements gathering and implementation</li>\n<li>Strong collaboration skills with engineering, analytics, and business teams to help deliver data initiatives</li>\n<li>Awareness of data security, compliance, and governance best practices</li>\n<li>Understanding of BI and analytics platforms (such as Tableau, Looker, Power BI) and supporting self-service analytics</li>\n</ul>\n<p>Why CoreWeave?</p>\n<p>At CoreWeave, we work hard, have fun, and move fast! We&#39;re in an exciting stage of hyper-growth that you will not want to miss out on. We&#39;re not afraid of a little chaos, and we&#39;re constantly learning. Our team cares deeply about how we build our product and how we work together, which is represented through our core values:</p>\n<ul>\n<li>Be Curious at Your Core</li>\n<li>Act Like an Owner</li>\n<li>Empower Employees</li>\n<li>Deliver Best-in-Class Client Experiences</li>\n<li>Achieve More Together</li>\n</ul>\n<p>We support and encourage an entrepreneurial outlook and independent thinking. We foster an environment that encourages collaboration and provides the opportunity to develop innovative solutions to complex problems. As we get set for takeoff, the growth opportunities within the organization are constantly expanding. You will be surrounded by some of the best talent in the industry, who will want to learn from you, too. 
Come join us!</p>\n<p>Salary Range: $143,000 to $210,000</p>\n<p>Benefits:</p>\n<ul>\n<li>Medical, dental, and vision insurance - 100% paid for by CoreWeave</li>\n<li>Company-paid Life Insurance</li>\n<li>Voluntary supplemental life insurance</li>\n<li>Short and long-term disability insurance</li>\n<li>Flexible Spending Account</li>\n<li>Health Savings Account</li>\n<li>Tuition Reimbursement</li>\n<li>Ability to Participate in Employee Stock Purchase Program (ESPP)</li>\n<li>Mental Wellness Benefits through Spring Health</li>\n<li>Family-Forming support provided by Carrot</li>\n<li>Paid Parental Leave</li>\n<li>Flexible, full-service childcare support with Kinside</li>\n<li>401(k) with a generous employer match</li>\n<li>Flexible PTO</li>\n<li>Catered lunch each day in our office and data center locations</li>\n<li>A casual work environment</li>\n<li>A work culture focused on innovative disruption</li>\n</ul>\n<p>Workplace:</p>\n<p>While we prioritize a hybrid work environment, remote work may be considered for candidates located more than 30 miles from an office, based on role requirements for specialized skill sets. New hires will be invited to attend onboarding at one of our hubs within their first month. 
Teams also gather quarterly to support collaboration.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_760c3e88-e35","directApply":true,"hiringOrganization":{"@type":"Organization","name":"CoreWeave","sameAs":"https://www.coreweave.com","logo":"https://logos.yubhub.co/coreweave.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coreweave/jobs/4649824006","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$143,000 to $210,000","x-skills-required":["data product management","data architecture","enterprise data engineering","data lakes","data warehouses","ETL/ELT and streaming pipelines","data governance frameworks","modern data stack technologies","Snowflake","BigQuery","Databricks","Apache Spark","Airflow","DBT","Kafka","data modeling","domain-driven design","scalable data platforms","BI and analytics platforms","Tableau","Looker","Power BI"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:48:58.405Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Livingston, NJ / New York, NY / Sunnyvale, CA / Bellevue, WA/San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data product management, data architecture, enterprise data engineering, data lakes, data warehouses, ETL/ELT and streaming pipelines, data governance frameworks, modern data stack technologies, Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka, data modeling, domain-driven design, scalable data platforms, BI and analytics platforms, Tableau, Looker, Power 
BI","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":143000,"maxValue":210000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9d394a58-904"},"title":"Accounting Manager, Crypto Assets","description":"<p>Ready to be pushed beyond what you think you’re capable of?</p>\n<p>At Coinbase, our mission is to increase economic freedom in the world.</p>\n<p>We&#39;re seeking a highly motivated accounting professional to join our Accounting team and play a key role in our continued growth and achievement of operational excellence. The Accounting Manager, Platform will have the opportunity to make meaningful impact on a dynamic, high-growth technology company.</p>\n<p>Reporting to the Assistant Controller, the Accounting Manager will be responsible for reconciling and analyzing platform data in the company&#39;s financial / platform systems. This position requires impeccable attention to detail and a focus on process improvement.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Reconcile and analyze ending balances</li>\n<li>Assist with the launch of new digital assets and products impacting platform accounting operations</li>\n<li>Ensure that appropriate internal controls over platform accounting are in place</li>\n<li>Operationalize new areas of the business to facilitate streamlined accounting transactions</li>\n<li>With major system experience, recognize the links upstream/downstream and verbalize the impacts &amp; reliance in connection with changes or enhancements</li>\n<li>Provide project requirements and coordinate, manage and oversee any testing to improve existing processes and automate any related systems</li>\n<li>Provide support for internal and external audits</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>9+ years of progressive accounting experience with a mix of Big 4 public accounting and 
industry experience in the fintech/financial services industries</li>\n<li>Minimum 2 years of experience in a manager role.</li>\n<li>Financial services industry experience such as fintech, payments and/or broker dealers</li>\n<li>B.A. or B.S. degree in Accounting, Finance or related field</li>\n<li>A demonstrated history of solving multiple complex operational and accounting challenges</li>\n<li>Desire to exceed expectations and drive positive change in the organization</li>\n<li>Operational processes experience</li>\n<li>Advanced Microsoft Excel skills, with the ability to manipulate and digest mass amounts of data required</li>\n<li>Individual should work well in a dynamic environment and be able to recommend and implement process improvements</li>\n<li>Demonstrates the ability to responsibly use generative AI tools and copilots (e.g., LibreChat, Gemini, Glean) in daily workflows, continuously learn as tools evolve, and apply human‑in‑the‑loop practices to deliver business‑ready outputs and drive measurable improvements in efficiency, cost, and quality.</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Experience working in Netsuite, FloQast and Google Suite</li>\n<li>Experience working in a SOX control environment</li>\n<li>Basic knowledge of digital assets and the crypto economy, or a desire to learn</li>\n<li>Basic SQL knowledge and experience with reporting/analytics tools (Snowflake, Looker, etc.)</li>\n<li>Experience working at a bank, fintech, financial regulatory agency, and/or professional service firm serving financial institution clients</li>\n</ul>\n<p><strong>Pay Transparency Notice</strong></p>\n<p>Annual base salary range (excluding equity and bonus): $166,345-$195,700 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9d394a58-904","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Coinbase","sameAs":"https://www.coinbase.com/","logo":"https://logos.yubhub.co/coinbase.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coinbase/jobs/7793903","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$166,345-$195,700 USD","x-skills-required":["accounting","fintech","financial services","Microsoft Excel","Netsuite","FloQast","Google Suite","SOX control","digital assets","crypto economy","SQL","reporting/analytics tools","Snowflake","Looker"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:48:47.084Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - USA"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"accounting, fintech, financial services, Microsoft Excel, Netsuite, FloQast, Google Suite, SOX control, digital assets, crypto economy, SQL, reporting/analytics tools, Snowflake, Looker","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":166345,"maxValue":195700,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3168d7d3-70b"},"title":"Partner Solutions Architect - North America","description":"<p>About Us</p>\n<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across North America. 
This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>As a Partner Solutions Architect, you will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud. Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Partner closely with North America Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>\n<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n<li>Serve as an internal subject matter expert on dbt’s major technology 
partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>\n<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n<li>Familiarity with cloud marketplace motions, co-sell programs, and 
partner-sourced pipeline generation</li>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>Benefits</p>\n<ul>\n<li>Unlimited vacation (and yes we use it!)</li>\n<li>Pension coverage</li>\n<li>Excellent healthcare</li>\n<li>Paid Parental Leave</li>\n<li>Wellness stipend</li>\n<li>Home office stipend, and more!</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3168d7d3-70b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673630005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data platforms","Snowflake","Databricks","Google Cloud","partner development","field engineering","sales engineering","consulting","partner engineering"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:48:30.813Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Canada - Remote; US - 
Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner development, field engineering, sales engineering, consulting, partner engineering, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_14d46b7f-188"},"title":"Senior Analyst, Field Analytics","description":"<p>We are looking for a Senior Analyst, Field Analytics to join our Go-To-Market Strategy &amp; Operations group. As part of this team, you will drive insight and scale within the global field organisation by building high-impact technical assets, ranging from executive Tableau dashboards to standardised Snowflake datasets.</p>\n<p>Your responsibilities will include designing, building, and maintaining high-visibility Tableau dashboards and reporting assets that provide actionable insights to business partners across the global organisation. You will also build and optimise production-grade data sets in Snowflake, ensuring that all field data (Pipeline, Bookings, Productivity) is clean, structured, and easily accessible for self-service analysis.</p>\n<p>In addition, you will take ownership of the technical documentation for all GTM reporting assets, ensuring data lineage, metric definitions, and logic are clearly defined and accessible. 
You will also champion the use of Generative AI tools to accelerate the analytics lifecycle, including automating SQL query generation, streamlining data preparation, and enhancing report documentation.</p>\n<p>To succeed in this role, you will need to have 5+ years of professional experience in Data Analytics or Business Intelligence, ideally within a global delivery model. You will also need to have technical stack expertise in SQL (Snowflake), BI visualisation tool (Tableau), and CRM data (Salesforce).</p>\n<p>Preferred qualifications include proficiency with scripting (python) &amp; data modelling (dbt), deep understanding of enterprise software sales processes, field operations, and cross-functional GTM mechanics.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_14d46b7f-188","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7728562","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL (Snowflake)","BI visualisation tool (Tableau)","CRM data (Salesforce)","Generative AI tools","Data Analytics","Business Intelligence"],"x-skills-preferred":["Scripting (python)","Data modelling (dbt)","Enterprise software sales processes","Field operations","Cross-functional GTM mechanics"],"datePosted":"2026-04-18T15:48:12.823Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL (Snowflake), BI visualisation tool (Tableau), CRM data (Salesforce), Generative AI tools, Data Analytics, Business Intelligence, Scripting (python), Data modelling (dbt), Enterprise 
software sales processes, Field operations, Cross-functional GTM mechanics"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_dceb42f0-405"},"title":"Accounting and Regulatory Reporting Manager, Canada","description":"<p>Ready to be pushed beyond what you think you’re capable of?</p>\n<p>At Coinbase, our mission is to increase economic freedom in the world.</p>\n<p>We&#39;re seeking a very specific candidate who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system.</p>\n<p>Our work culture is intense and isn’t for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there’s no better place to be.</p>\n<p>While many roles at Coinbase are remote-first, we are not remote-only. In-person participation is required throughout the year. Team and company-wide offsites are held multiple times annually to foster collaboration, connection, and alignment.</p>\n<p>Coinbase Canada seeks an Accounting and Regulatory Reporting Manager to join our Global Financial and Regulatory Reporting team.</p>\n<p>Reporting to the Director of Reporting and Technical Policy, you will serve as the primary Finance owner for all recurring regulatory filings, including Form 31-103F1, CIRO Form 1, and other OSC/CIRO submissions.</p>\n<p>This technical role involves maintaining IFRS books of account and collaborating with cross-functional teams (Tax, Treasury, Legal) to ensure capital adequacy.</p>\n<p>You will drive operational excellence by designing scalable processes, building robust internal controls, and automating reporting cycles for a high-growth tech environment.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Oversee the financial and regulatory reporting functions in Canada and ensure these processes operate effectively and accurately, following the Coinbase centralized model, 
delivering accurate regulatory and financial insights into the business.</li>\n</ul>\n<ul>\n<li>Own the Company’s parallel books of accounts maintained in line with IFRS and maintain appropriate reconciliations and valid support for all adjustments.</li>\n</ul>\n<ul>\n<li>Own end-to-end preparation, review, and coordination of all recurring regulatory filings.</li>\n</ul>\n<ul>\n<li>Responsible for preparation and coordination of financial statements, regulatory reporting, and other related information for external use (audit, submission to regulators, etc.), and play a leading role in external communication as the representative of the Finance department.</li>\n</ul>\n<ul>\n<li>Prepare Coinbase Canada monthly management reporting and analytical review thereof, including reporting and communication to the Board.</li>\n</ul>\n<ul>\n<li>Act as a contact point within the company for financial reporting queries and queries from regulatory authorities relating to the financial and regulatory reporting obligations of the firm.</li>\n</ul>\n<ul>\n<li>Prepare and update various finance policies and procedures.</li>\n</ul>\n<ul>\n<li>Monitor and oversee the performance of outsourced services by outsourced service providers relating to the finance function and assess KPIs and SLAs.</li>\n</ul>\n<ul>\n<li>Active in Coinbase Canada liquidity management and coordinating closely with the treasury team.</li>\n</ul>\n<ul>\n<li>Participate in critical projects, including the transitioning to new external vendors, integration, automation or enhancements.</li>\n</ul>\n<ul>\n<li>Support ad hoc financial information requests and analyses and other special projects as assigned.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>6+ years&#39; experience as a regulatory reporting or finance professional at a bank, fin-tech, financial regulatory agency, securities dealer or investment dealer, and/or professional service firm serving financial institution clients.</li>\n</ul>\n<ul>\n<li>3+ years 
experience working in a regulated entity/environment, with direct involvement in the preparation or review of regulatory filings for a securities regulator (e.g., OSC, CIRO, IIROC, or equivalent).</li>\n</ul>\n<ul>\n<li>Previous Board/senior management level experience, or a demonstrated ability to communicate directly with the Board/senior management.</li>\n</ul>\n<ul>\n<li>CPA or equivalent required.</li>\n</ul>\n<ul>\n<li>Ability to work collaboratively with a broad range of business functions, with an emphasis on senior management and a track record of performing in a management role.</li>\n</ul>\n<ul>\n<li>Project management experience is essential.</li>\n</ul>\n<ul>\n<li>Excellent oral and written communication – fluency in English is a key requirement.</li>\n</ul>\n<ul>\n<li>Capacity to meet deadlines while maintaining quality standards and effective time management.</li>\n</ul>\n<ul>\n<li>Ability to thrive in a fast-paced environment within a high growth business.</li>\n</ul>\n<ul>\n<li>Strong knowledge and experience with IFRS.</li>\n</ul>\n<ul>\n<li>Expert knowledge of Canadian securities regulatory reporting requirements, including NI 31-103, Form 31-103F1, and familiarity with CIRO capital adequacy rules, is highly preferred.</li>\n</ul>\n<ul>\n<li>Strong communication skills, both written and verbal.</li>\n</ul>\n<ul>\n<li>Experience interacting directly with securities regulators (e.g., OSC, CIRO/IIROC, provincial securities commissions) or supporting regulatory examinations and inquiries.</li>\n</ul>\n<ul>\n<li>Demonstrates the ability to responsibly use generative AI tools and copilots in daily workflows, continuously learn as tools evolve, and apply human-in-the-loop practices to deliver business-ready outputs and drive measurable improvements in efficiency, cost, and quality.</li>\n</ul>\n<p>Nice to haves:</p>\n<ul>\n<li>Experience working in Netsuite, FloQast and Google Suite.</li>\n</ul>\n<ul>\n<li>Experience with CIRO Form 1 or similar dealer capital adequacy 
filings.</li>\n</ul>\n<ul>\n<li>Basic knowledge of digital assets and the crypto economy, or a desire to learn.</li>\n</ul>\n<ul>\n<li>Basic knowledge of SQL and experience working with Snowflake, Looker or similar analytics tools.</li>\n</ul>\n<p>Position ID: P75564 #LI-Remote</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_dceb42f0-405","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Coinbase","sameAs":"https://www.coinbase.com/","logo":"https://logos.yubhub.co/coinbase.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/coinbase/jobs/7528144","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$170,000-$170,000 CAD","x-skills-required":["regulatory reporting","finance","IFRS","Canadian securities regulatory reporting requirements","NI 31-103","Form 31-103F1","CIRO capital adequacy rules","CPA","project management","oral and written communication","time management","fast-paced environment","high growth business"],"x-skills-preferred":["Netsuite","FloQast","Google Suite","CIRO Form 1","digital assets","crypto economy","SQL","Snowflake","Looker"],"datePosted":"2026-04-18T15:48:10.745Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - Canada"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"regulatory reporting, finance, IFRS, Canadian securities regulatory reporting requirements, NI 31-103, Form 31-103F1, CIRO capital adequacy rules, CPA, project management, oral and written communication, time management, fast-paced environment, high growth business, Netsuite, FloQast, Google Suite, CIRO Form 1, digital assets, crypto economy, SQL, Snowflake, 
Looker","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":170000,"maxValue":170000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_cc8eb5bc-349"},"title":"Staff Data Analyst","description":"<p>As a Staff Data Analyst at Okta, you will play a key role in creating the foundation for data-based decision-making across the company&#39;s functional business teams. You will work with internal customers to identify ways to effectively leverage data using cutting-edge cloud and big data technologies to drive business insights.</p>\n<p>Your responsibilities will include serving as the definitive SME for Customer First analytics, defining the data models, metrics, and value drivers that steer company strategy. You will also lead the exploration and integration of AI-driven tools to automate workflows and pioneer new methodologies for data discovery.</p>\n<p>Additionally, you will architect the analytics strategy for customer insights, leveraging product telemetry and public data to identify and predict risk and growth signals. You will own the vision for our semantic layer, ensuring it supports advanced modeling, self-service, and high-integrity dashboarding.</p>\n<p>You will participate in high-impact initiatives in predictive modeling (cross-sell, churn, LTV) that directly influence GTM execution. You will partner with leadership and account teams to translate raw insights into a high-impact, action-oriented Command Center, empowering account teams to instantly prioritize and execute on the most urgent opportunities and risks.</p>\n<p>You will also partner with Engineering to co-design data pipelines and transformations that ensure long-term scalability and data quality. 
You will set the bar for excellence in data storytelling and modeling, mentoring the broader team on best practices and process improvement.</p>\n<p>To succeed in this role, you will need to have a passion for driving decisions and insights through data. You will be detail-oriented, analytical, and able to solve big problems. You will also need to be able to effectively communicate with team members and business partners.</p>\n<p>In terms of qualifications, you will need to have a BS in CS, MIS, or a related technical degree. You will also need to have 7+ years of experience as a Data Analyst/Data Engineer/BI Developer. Advanced SQL experience is also required.</p>\n<p>Preferred qualifications include experience with building reports and visualizations to represent data intuitively in Tableau or similar data visualization tools. Advanced analytics, data science, AI/ML experience and techniques are also a plus.</p>\n<p>Finally, you will need to be able to work cross-functionally and communicate with technical and non-technical teams. 
Experience with ETL processes, software development, and lifecycle awareness, using Python, Java, Databricks, and Snowflake is also a plus.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_cc8eb5bc-349","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7792010","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Data Analysis","Data Visualization","Tableau","Python","Java","Databricks","Snowflake"],"x-skills-preferred":["Advanced Analytics","Data Science","AI/ML","ETL Processes","Software Development","Lifecycle Awareness"],"datePosted":"2026-04-18T15:48:00.277Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Data Analysis, Data Visualization, Tableau, Python, Java, Databricks, Snowflake, Advanced Analytics, Data Science, AI/ML, ETL Processes, Software Development, Lifecycle Awareness"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4b4378c3-f92"},"title":"Principal Software Engineer","description":"<p>We&#39;re looking for a Principal Software Engineer to join our Advertising, Company Intelligence, and Intent team. 
As a key member of our engineering team, you&#39;ll design and implement the core systems that power our real-time marketing platform.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Designing and building distributed systems that process, enrich, and respond to billions of behavioral events per day in real time</li>\n<li>Developing high-performance APIs and services that support advertising, identity, and intent features across the Marketing Platform</li>\n<li>Leveraging machine learning and large language models (LLMs) to analyze behavioral data, classify content, extract signals, and enable intelligent decision-making</li>\n<li>Building intelligent agents using frameworks like LangGraph or MCP to reason over data and power user-facing insights</li>\n<li>Designing and operating data pipelines using tools like Kafka, Kinesis, and ClickHouse to support both streaming and batch workloads</li>\n<li>Driving quality, performance, scalability, and observability across all systems you own</li>\n<li>Collaborating cross-functionally with product managers, data scientists, and engineers to deliver customer-facing features and internal tooling</li>\n<li>Contributing to technical leadership and mentorship of teammates</li>\n</ul>\n<p>We&#39;re looking for someone with 8+ years of backend, data, or infrastructure engineering experience, or equivalent impact and leadership. 
You should have strong experience in at least one of the following areas:</p>\n<ul>\n<li>Distributed systems engineering</li>\n<li>Big data infrastructure</li>\n<li>Applied AI/ML</li>\n</ul>\n<p>You should also be proficient in one or more core languages (Java, Go, Python), have a solid grasp of SQL and large-scale data modeling, and be familiar with databases and tools such as ClickHouse, DynamoDB, Bigtable, Memcached, Kafka, Kinesis, Firehose, Airflow, and Snowflake.</p>\n<p>Bonus points if you have experience in ad tech, real-time bidding (RTB), or programmatic systems; background in identity resolution, attribution, or behavioral analytics at scale; contributions to open source in ML, infrastructure, or data tooling; or strong product instincts and a passion for building tools that drive meaningful outcomes.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_4b4378c3-f92","directApply":true,"hiringOrganization":{"@type":"Organization","name":"ZoomInfo","sameAs":"https://www.zoominfo.com/","logo":"https://logos.yubhub.co/zoominfo.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/zoominfo/jobs/8340521002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$163,800-$257,400 USD","x-skills-required":["Distributed systems engineering","Big data infrastructure","Applied AI/ML","Java","Go","Python","SQL","ClickHouse","DynamoDB","Bigtable","Memcached","Kafka","Kinesis","Firehose","Airflow","Snowflake"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:47:17.745Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bethesda, Maryland, United States; Remote US - PST; Waltham, Massachusetts, United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Distributed systems 
engineering, Big data infrastructure, Applied AI/ML, Java, Go, Python, SQL, ClickHouse, DynamoDB, Bigtable, Memcached, Kafka, Kinesis, Firehose, Airflow, Snowflake","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":163800,"maxValue":257400,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a27a43f9-673"},"title":"Senior Data Analyst, Enterprise Analytics","description":"<p>As a Senior Data Analyst on GitLab&#39;s Enterprise Analytics team, you&#39;ll support some of GitLab&#39;s most visible go-to-market and executive reporting. You&#39;ll work closely with Sales, Marketing, Revenue Operations, Finance, and analytics partners to deliver company-level reports, go-to-market performance views, and lifecycle reporting that leaders use to run the business.</p>\n<p>Working in Snowflake, dbt, Tableau, and VS Code, you&#39;ll turn ambiguous business questions into trusted, well-documented data products that serve as a single source of truth for performance and targets versus actuals. 
You&#39;ll also help improve our Enterprise Analytics handbook and core data foundations so strategy, processes, and metric definitions are clear and usable in our all-remote, values-driven environment.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Build and maintain executive-facing scorecards, go-to-market performance views, and new-customer reporting that connect pipeline, bookings, and product usage signals into targets-versus-actuals tracking by motion.</li>\n</ul>\n<ul>\n<li>Design performant, reusable Tableau Cloud data sources and help shape the underlying dbt models so reporting layers are stable, governed, and aligned to single-source-of-truth patterns.</li>\n</ul>\n<ul>\n<li>Collaborate with Analytics Engineering and Data Engineering to improve dbt models that support reliable, scalable reporting for business stakeholders.</li>\n</ul>\n<ul>\n<li>Document metric logic, data lineage, and Tableau usage patterns in the handbook so stakeholders can understand how data products are built and used.</li>\n</ul>\n<ul>\n<li>Implement and monitor data quality checks and reconciliations across Snowflake, Salesforce, and other go-to-market systems to strengthen trust in company-level reporting.</li>\n</ul>\n<ul>\n<li>Partner with Revenue Operations, Finance, and go-to-market analysts and stakeholders to define questions, align on metric definitions and pacing logic, and deliver dashboards and deep-dive analyses.</li>\n</ul>\n<ul>\n<li>Present insights in a clear, actionable way for both operational and executive audiences so teams can make better decisions faster.</li>\n</ul>\n<ul>\n<li>Share best practices in SQL, data visualization, and go-to-market analytics, including code reviews and pairing support on high-priority dashboards.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Strong analytical skills with the ability to frame ambiguous business questions, structure analysis, and translate findings into clear recommendations for go-to-market and executive 
stakeholders.</li>\n</ul>\n<ul>\n<li>Advanced SQL skills working with large, complex data models, preferably in Snowflake, including joining many tables and building reusable queries and views.</li>\n</ul>\n<ul>\n<li>Proven experience building executive-facing dashboards in Tableau or a similar business intelligence tool, including data source design, performance tuning, and visualization best practices.</li>\n</ul>\n<ul>\n<li>Deep understanding of go-to-market concepts such as new-customer and first-order metrics, pipeline, bookings, Net ARR, go-to-market motions, sales segments, and marketing funnel metrics.</li>\n</ul>\n<ul>\n<li>Strong command of GenAI tools for daily use in your work and the judgment to use them effectively to improve speed and quality.</li>\n</ul>\n<ul>\n<li>Ability to communicate complex analyses in a clear, concise way through presentations, written narratives, and data visualizations for both technical and non-technical audiences.</li>\n</ul>\n<ul>\n<li>Comfort working in an all-remote, asynchronous environment, with a high level of ownership and the ability to drive work forward independently.</li>\n</ul>\n<ul>\n<li>Alignment with GitLab&#39;s values and openness to applying transferable skills from related analytics, revenue operations, or business intelligence roles.</li>\n</ul>\n<ul>\n<li>Experience with dbt and modern analytics engineering patterns, including trusted marts, certified sources, and documented lineage, is helpful but not required.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_a27a43f9-673","directApply":true,"hiringOrganization":{"@type":"Organization","name":"GitLab","sameAs":"https://about.gitlab.com/","logo":"https://logos.yubhub.co/about.gitlab.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/gitlab/jobs/8478359002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Tableau","dbt","Snowflake","GenAI","data visualization","business intelligence"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:58.266Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, Bangalore"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Tableau, dbt, Snowflake, GenAI, data visualization, business intelligence"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_19d143c9-cac"},"title":"Data Analytics Engineer","description":"<p>Our mission is to bring web3 to a billion people by providing builders with the tools they need to build exceptional onchain products. 
As a Data Analytics Engineer, you will be the data layer for the entire company, designing clean, trusted datasets that power our AI tooling and ensuring every team can make decisions from a single source of truth.</p>\n<p>You will build and own the canonical data models in Snowflake that serve as Alchemy&#39;s company-wide source of truth, structure datasets so vendor AI tools perform optimally out of the box, and explore and prototype MCP integrations that let internal teams query data conversationally.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Building and owning the canonical data models in Snowflake that serve as Alchemy&#39;s company-wide source of truth</li>\n<li>Structuring datasets so vendor AI tools perform optimally out of the box</li>\n<li>Exploring and prototyping MCP integrations that let internal teams query data conversationally</li>\n<li>Eliminating shadow tables and one-off datasets by proactively serving team data needs at the platform level</li>\n</ul>\n<p>Requirements include 6+ years in data engineering with strong SQL and deep Snowflake expertise, experience designing efficient, scalable analytical data models, and proficiency with dbt or comparable transformation frameworks.</p>\n<p>Benefits include medical, dental, and vision coverage, gym reimbursement, home office build-out budget, in-office group meals, commuter benefits, flexible time off, wellbeing and mental health perks, learning and development stipend, company-sponsored conferences and events, HSA and FSA plans, and fertility benefits.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_19d143c9-cac","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Alchemy","sameAs":"https://www.alchemy.com/","logo":"https://logos.yubhub.co/alchemy.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/alchemy/jobs/4677021005","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$200,000 - $240,000 annually","x-skills-required":["data engineering","SQL","Snowflake","dbt","MCP","AI tooling"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:55.905Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New York, United States, San Francisco, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data engineering, SQL, Snowflake, dbt, MCP, AI tooling","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":200000,"maxValue":240000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_0a3dc5a7-8d9"},"title":"Senior Analytics Engineer","description":"<p>We are seeking a Senior Analytics Engineer to support the Enterprise by building reliable, well-modeled, and trusted data for reporting, decision-making, and emerging AI use cases.</p>\n<p>As a Senior Analytics Engineer, you will design scalable data models, define consistent business logic, and help establish a strong semantic foundation that enables both human analytics and machine-driven intelligence.</p>\n<p>You will partner closely with Finance, People and Company Operations stakeholders, Data Analysts, and Data Engineers to ensure data is accurate, consistent, and easy to consume; whether through dashboards, self-service exploration, or AI-powered workflows.</p>\n<p>Responsibilities:</p>\n<p>Data Modeling &amp; 
Semantics</p>\n<ul>\n<li>Design, build, and maintain scalable data models using dbt and Snowflake</li>\n<li>Define and standardize core Finance, HR and Enterprise level metrics (e.g., revenue, ARR, billing, Attrition, Executive Insights, Security) with clear, governed logic</li>\n<li>Establish consistent modeling patterns, naming conventions, and semantic clarity across datasets</li>\n<li>Contribute to a shared semantic layer that supports both analytics and AI use cases</li>\n</ul>\n<p>AI-Ready Data &amp; Snowflake Ecosystem</p>\n<ul>\n<li>Prepare high-quality, well-governed datasets for use with Snowflake Cortex and Snowflake Intelligence</li>\n<li>Enable structured data foundations that support LLM-powered use cases, semantic querying, and intelligent applications</li>\n<li>Ensure data is context-rich, well-documented, and aligned with business meaning to improve AI accuracy and trust</li>\n</ul>\n<p>Data Quality, Governance &amp; Trust</p>\n<ul>\n<li>Implement robust testing, validation, and documentation practices in dbt</li>\n<li>Ensure consistency across reports and dashboards through shared definitions and reusable models</li>\n<li>Apply data governance best practices, including access controls, lineage, and auditability</li>\n<li>Partner across teams to establish clear ownership and accountability for data assets</li>\n</ul>\n<p>Collaboration &amp; Delivery</p>\n<ul>\n<li>Partner with Finance, Analysts, and cross-functional stakeholders to translate business needs into data solutions</li>\n<li>Support self-service analytics by building intuitive, reusable datasets</li>\n<li>Contribute to scalable data workflows that balance immediate business needs with long-term maintainability</li>\n<li>Work within an agile environment, contributing to planning, prioritization, and continuous improvement</li>\n</ul>\n<p>AI and Data Mindset</p>\n<ul>\n<li>Demonstrate an AI-first mindset, thinking beyond data models and dashboards to how data can power intelligent systems 
and decision-making</li>\n<li>Understand the importance of well-modeled, well-documented, and semantically clear data for AI and LLM-based use cases</li>\n<li>A level of comfort leveraging AI-assisted workflows to improve productivity, code quality, and consistency</li>\n<li>Curiosity for emerging capabilities in platforms like Snowflake Cortex and Snowflake Intelligence, and how they can be applied to Enterprise analytics</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5–8+ years of experience in Analytics Engineering, Data Engineering, or similar roles</li>\n<li>Strong SQL skills and experience building analytics-ready data models</li>\n<li>Mentorship &amp; Engineering Excellence: Mentorship, raising the technical bar, establishing organization-wide standards for dbt/SQL quality and CI/CD</li>\n<li>Hands-on experience with dbt and Snowflake or other ETL, Modeling and database platforms</li>\n<li>Solid understanding of data modeling principles, including dimensional modeling and semantic design</li>\n<li>Ability to navigate highly ambiguous business challenges, translating vague, complex, or competing goals from executive stakeholders into clear, actionable, and robust data solutions</li>\n<li>Experience translating business requirements into clear, maintainable data logic</li>\n<li>Familiarity with SaaS metrics and Finance and People data (e.g., ARR, revenue recognition, billing, attrition etc.)</li>\n<li>Experience with data quality, testing, and documentation best practices</li>\n<li>Exposure to Python, R, or data processing frameworks (e.g., PySpark) is a plus</li>\n<li>Experience with BI tools such as Tableau or Looker</li>\n<li>Strong communication skills and ability to work across technical and business teams</li>\n</ul>\n<p>What you can look forward to as an Okta employee!</p>\n<ul>\n<li>Amazing Benefits</li>\n<li>Making Social Impact</li>\n<li>Fostering Diversity, Equity, Inclusion and Belonging at Okta</li>\n<li>Okta cultivates a dynamic work environment, 
providing the best tools, technology and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. Each organization is unique in the degree of flexibility and mobility in which they work so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_0a3dc5a7-8d9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7818510","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["dbt","Snowflake","SQL","data modeling","dimensional modeling","semantic design","ETL","data quality","testing","documentation","Python","R","PySpark","Tableau","Looker"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:30.556Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bellevue, Washington; Chicago, Illinois; San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"dbt, Snowflake, SQL, data modeling, dimensional modeling, semantic design, ETL, data quality, testing, documentation, Python, R, PySpark, Tableau, Looker"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_25f010f0-7d1"},"title":"Data Engineer","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. 
By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Brex’s AI-native automation and world-class service eliminate manual expense and accounting tasks for customers so they can focus on what matters most. Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream. We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Data at Brex</p>\n<p>Our Scientists and Engineers work together to make data, and insights derived from data, a core asset across Brex. But it&#39;s more than just crunching numbers. The Data team at Brex develops infrastructure, statistical models, and products using data. Our work is ingrained in Brex&#39;s decision-making process, the efficiency of our operations, our risk management policies, and the unparalleled experience we provide our customers.</p>\n<p>What You’ll Do</p>\n<p>As a Data Engineer at Brex, you will be a core contributor in transforming raw data into actionable insights for various departments across the organization. You&#39;ll collaborate closely with Data Scientists, Software Engineers, and business units to create efficient data models, pipelines, and analytics frameworks that drive the business forward. 
You also play a leading role in the design, implementation, and maintenance of Core Data tables, our high-quality, curated data source for a wide range of analytic applications.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our San Francisco office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday. As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, build, and maintain data models and pipelines that scale with the growing number of services, products, and changes in the company.</li>\n</ul>\n<ul>\n<li>Collaborate closely with Data Scientists, Data Analysts, and Business teams to understand their data needs, translating them into robust, efficient, scalable data solutions that enable ease of predictive analytics, data analysis, and metrics formulation.</li>\n</ul>\n<ul>\n<li>Maintain data documentation and definitions, building and ensuring that source-of-truth tables remain high quality for data science and reporting applications.</li>\n</ul>\n<ul>\n<li>Develop and enable integration with various data sources, allowing for more data-driven initiatives across the company.</li>\n</ul>\n<ul>\n<li>Apply best practices in data management to ensure the reliability and robustness of data utilized across various analytics applications.</li>\n</ul>\n<ul>\n<li>Set and proliferate company-wide standards for data relating to structure, quality, and expectations.</li>\n</ul>\n<ul>\n<li>Act as a liaison between the technical and non-technical teams, bridging gaps and ensuring that data solutions align with business objectives.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>3+ years of experience 
in Data Engineering, Data Analytics, or a related field such as Analytics Engineering.</li>\n</ul>\n<ul>\n<li>2+ years of experience working with modern data transformation tools like DBT.</li>\n</ul>\n<ul>\n<li>Advanced knowledge of databases and SQL with the ability to efficiently stage, process, and transform data.</li>\n</ul>\n<ul>\n<li>Experience integrating and orchestrating data workflows with various modern data tools and systems.</li>\n</ul>\n<ul>\n<li>Experience with data modeling, ETL/ELT processes, and data warehousing solutions.</li>\n</ul>\n<ul>\n<li>Experience working with a data warehouse such as Snowflake.</li>\n</ul>\n<ul>\n<li>Experience with a data workflow orchestrator tool such as Airflow.</li>\n</ul>\n<ul>\n<li>Experience with a programming language such as Python.</li>\n</ul>\n<ul>\n<li>Familiarity with BI tools such as Looker, Tableau, or similar platforms is a plus.</li>\n</ul>\n<ul>\n<li>Exceptional quantitative and analytical skills.</li>\n</ul>\n<ul>\n<li>Strong communication skills and ability to collaborate with various stakeholders, both technical and non-technical.</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $120,800 - $151,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_25f010f0-7d1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8366850002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$120,800 - $151,000","x-skills-required":["DBT","databases","SQL","data modeling","ETL/ELT processes","data warehousing solutions","Snowflake","Airflow","Python","BI tools","Looker","Tableau"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:18.514Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"DBT, databases, SQL, data modeling, ETL/ELT processes, data warehousing solutions, Snowflake, Airflow, Python, BI tools, Looker, Tableau","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120800,"maxValue":151000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1d204fa1-067"},"title":"Data Engineer","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. 
By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Data at Brex</p>\n<p>Our Scientists and Engineers work together to make data, and insights derived from data, a core asset across Brex. But it&#39;s more than just crunching numbers. The Data team at Brex develops infrastructure, statistical models, and products using data. Our work is ingrained in Brex&#39;s decision-making process, the efficiency of our operations, our risk management policies, and the unparalleled experience we provide our customers.</p>\n<p>What You’ll Do</p>\n<p>As a Data Engineer at Brex, you will be a core contributor in transforming raw data into actionable insights for various departments across the organization. You&#39;ll collaborate closely with Data Scientists, Software Engineers, and business units to create efficient data models, pipelines, and analytics frameworks that drive the business forward. You also play a leading role in the design, implementation, and maintenance of Core Data tables, our high-quality, curated data source for a wide range of analytic applications.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our Seattle office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday. 
As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, build, and maintain data models and pipelines that scale with the growing number of services, products, and changes in the company.</li>\n</ul>\n<ul>\n<li>Collaborate closely with Data Scientists, Data Analysts, and Business teams to understand their data needs, translating them into robust, efficient, scalable data solutions that enable ease of predictive analytics, data analysis, and metrics formulation.</li>\n</ul>\n<ul>\n<li>Maintain data documentation and definitions, building and ensuring that source-of-truth tables remain high quality for data science and reporting applications.</li>\n</ul>\n<ul>\n<li>Develop and enable integration with various data sources, allowing for more data-driven initiatives across the company.</li>\n</ul>\n<ul>\n<li>Apply best practices in data management to ensure the reliability and robustness of data utilized across various analytics applications.</li>\n</ul>\n<ul>\n<li>Set and proliferate company-wide standards for data relating to structure, quality, and expectations.</li>\n</ul>\n<ul>\n<li>Act as a liaison between the technical and non-technical teams, bridging gaps and ensuring that data solutions align with business objectives.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>3+ years of experience in Data Engineering, Data Analytics, or a related field such as Analytics Engineering.</li>\n</ul>\n<ul>\n<li>2+ years of experience working with modern data transformation tools like DBT.</li>\n</ul>\n<ul>\n<li>Advanced knowledge of databases and SQL with the ability to efficiently stage, process, and transform data.</li>\n</ul>\n<ul>\n<li>Experience integrating and orchestrating data workflows with various modern data tools and systems.</li>\n</ul>\n<ul>\n<li>Experience with data modeling, ETL/ELT processes, and data warehousing solutions.</li>\n</ul>\n<ul>\n<li>Experience working with a data warehouse such as 
Snowflake.</li>\n</ul>\n<ul>\n<li>Experience with a data workflow orchestrator tool such as Airflow.</li>\n</ul>\n<ul>\n<li>Experience with a programming language such as Python.</li>\n</ul>\n<ul>\n<li>Familiarity with BI tools such as Looker, Tableau, or similar platforms is a plus.</li>\n</ul>\n<ul>\n<li>Exceptional quantitative and analytical skills.</li>\n</ul>\n<ul>\n<li>Strong communication skills and ability to collaborate with various stakeholders, both technical and non-technical.</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $120,800 - $151,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_1d204fa1-067","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8510493002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$120,800 - $151,000","x-skills-required":["DBT","databases","SQL","data modeling","ETL/ELT processes","data warehousing solutions","Snowflake","Airflow","Python","BI tools","Looker","Tableau"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:02.393Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Seattle, Washington, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"DBT, databases, SQL, data modeling, ETL/ELT processes, data warehousing solutions, Snowflake, Airflow, Python, BI tools, 
Looker, Tableau","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120800,"maxValue":151000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2d538f60-462"},"title":"Staff Software Engineer - ISPM","description":"<p>Secure Every Identity, from AI to Human</p>\n<p>Identity is the key to unlocking the potential of AI. Okta secures AI by building the trusted, neutral infrastructure that enables organisations to safely embrace this new era. This work requires a relentless drive to solve complex challenges with real-world stakes. We are looking for builders and owners who operate with speed and urgency and execute with excellence.</p>\n<p>This is an opportunity to do career-defining work. We&#39;re all in on this mission. If you are too, let&#39;s talk.</p>\n<p><strong>The Identity Security Posture Management (ISPM) Team</strong></p>\n<p>At ISPM (previously known as Spera), we are revolutionising the field of identity security, providing comprehensive solutions to address the challenges faced by modern organisations. Our platform provides security leaders with end-to-end contextual visibility of their identity landscape, reducing risks, enhancing operational efficiency, and ensuring compliance.</p>\n<p>With an exceptional team dedicated to pushing boundaries and achieving remarkable results, we are seeking a talented engineer to join us in shaping the future of identity security.</p>\n<p>This role requires working from the office four times a week.</p>\n<p><strong>What you’ll be doing</strong></p>\n<p>As a Staff Software Engineer at ISPM, you will play a critical role in building our platform. Your expertise and contributions will have a direct impact on the success and effectiveness of our product. 
Specifically, your responsibilities will include:</p>\n<ul>\n<li>Lead, design, architect, and build high-quality scalable software by introducing best practices around software engineering architecture and processes</li>\n</ul>\n<ul>\n<li>Raise the bar on engineering excellence by improving standard methodologies, producing best-in-class code, documentation, testing, and monitoring</li>\n</ul>\n<ul>\n<li>Collaborate with product and R&amp;D engineers to translate product requirements into technical solutions.</li>\n</ul>\n<ul>\n<li>Identify and resolve technical issues and performance bottlenecks.</li>\n</ul>\n<ul>\n<li>Stay up to date with the latest industry trends and technologies to continuously improve our products.</li>\n</ul>\n<ul>\n<li>Be a mentor for colleagues and help promote knowledge-sharing</li>\n</ul>\n<p><strong>What you’ll bring to the role</strong></p>\n<ul>\n<li>8+ years of experience as a software engineer, preferably in a fast-paced environment.</li>\n</ul>\n<ul>\n<li>5+ years of programming experience with Python, specifically focused on backend development.</li>\n</ul>\n<ul>\n<li>A proven track record of successfully building and delivering large-scale systems.</li>\n</ul>\n<ul>\n<li>Solid understanding of software engineering principles, design patterns, and best practices.</li>\n</ul>\n<ul>\n<li>Excellent problem-solving skills and a detail-oriented mindset.</li>\n</ul>\n<ul>\n<li>Strong communication and collaboration abilities to work effectively within a team.</li>\n</ul>\n<p><strong>Extra credit if you have experience in any of the following</strong></p>\n<ul>\n<li>Experience with Snowflake or other data warehousing platforms.</li>\n</ul>\n<ul>\n<li>Experience with Amazon Web Services and modern cloud stack.</li>\n</ul>\n<p>The Okta Experience</p>\n<ul>\n<li>Supporting Your Well-Being</li>\n</ul>\n<ul>\n<li>Driving Social Impact</li>\n</ul>\n<ul>\n<li>Developing Talent and Fostering Connection + Community</li>\n</ul>\n<p>We are 
intentional about connection. Our global community, spanning over 20 offices worldwide, is united by a drive to innovate. Your journey begins with an immersive, in-person onboarding experience designed to accelerate your impact and connect you to our mission and team from day one.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2d538f60-462","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7462926","x-work-arrangement":"hybrid","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","Backend Development","Software Engineering Principles","Design Patterns","Best Practices","Problem-Solving Skills","Detail-Oriented Mindset","Communication and Collaboration Abilities"],"x-skills-preferred":["Snowflake","Amazon Web Services","Modern Cloud Stack"],"datePosted":"2026-04-18T15:45:13.300Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Tel Aviv, Israel"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Backend Development, Software Engineering Principles, Design Patterns, Best Practices, Problem-Solving Skills, Detail-Oriented Mindset, Communication and Collaboration Abilities, Snowflake, Amazon Web Services, Modern Cloud Stack"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c53ecdd3-dc7"},"title":"Scale Solution Engineer","description":"<p>As a Scale Solution Engineer at Databricks, you will play a critical role in advising customers during their onboarding process. 
You will work directly with customers to help them onboard and deploy Databricks in their production environment.</p>\n<p>Your impact will be significant, ensuring new customers have an excellent experience by providing technical assistance early in their journey. You will become an expert on the Databricks Platform and guide customers in making the best technical decisions. You will also work directly with multiple customers concurrently to provide technical solutions.</p>\n<p>To succeed in this role, you will need:</p>\n<ul>\n<li>An undergraduate degree or higher in Computer Science, Information Systems, or relevant experience</li>\n<li>1+ years experience in a technical role, preferably in the data or cloud field</li>\n<li>Knowledge of at least one of the public cloud platforms AWS, Azure, or GCP</li>\n<li>Knowledge of a programming language such as Python, Scala, or SQL</li>\n<li>Knowledge of end-to-end data analytics workflow</li>\n<li>Hands-on professional or academic experience in one or more of the following: Data Engineering technologies (e.g., ETL, DBT, Spark, Airflow), Data Warehousing technologies (e.g., SQL, Stored Procedures, Redshift, Snowflake)</li>\n<li>Excellent time management and prioritization skills</li>\n<li>Excellent written and verbal communication</li>\n</ul>\n<p>Bonus: Knowledge of Data Science and Machine Learning (e.g., build and deploy ML Models)</p>","url":"https://yubhub.co/jobs/job_c53ecdd3-dc7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8408817002","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["public cloud 
platforms","AWS","Azure","GCP","Python","Scala","SQL","Data Engineering technologies","ETL","DBT","Spark","Airflow","Data Warehousing technologies","Stored Procedures","Redshift","Snowflake"],"x-skills-preferred":["Data Science","Machine Learning"],"datePosted":"2026-04-18T15:44:58.601Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Rica"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"public cloud platforms, AWS, Azure, GCP, Python, Scala, SQL, Data Engineering technologies, ETL, DBT, Spark, Airflow, Data Warehousing technologies, Stored Procedures, Redshift, Snowflake, Data Science, Machine Learning"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_25a942db-90e"},"title":"Senior Data Analyst (Auth0)","description":"<p>Secure Every Identity, from AI to Human</p>\n<p>Identity is the key to unlocking the potential of AI. Okta secures AI by building the trusted, neutral infrastructure that enables organisations to safely embrace this new era. This work requires a relentless drive to solve complex challenges with real-world stakes. We are looking for builders and owners who operate with speed and urgency and execute with excellence.</p>\n<p>This is an opportunity to do career-defining work. We&#39;re all in on this mission. If you are too, let&#39;s talk.</p>\n<p><strong>What is the Developer Led Growth (DLG) Team?</strong></p>\n<p>At the heart of our mission is a simple goal: to make choosing and integrating authentication a delightful experience for every developer.</p>\n<p>We’re the DLG team, essentially Product Led Growth (PLG), but for developers using Auth0. We are a high-impact, multidisciplinary engine that includes Developer Relations, Developer Marketing, our Startup Program, Product Activation and Conversion. 
We don&#39;t just move metrics, we build the journey that turns curious developers into lifelong advocates.</p>\n<p><strong>Why This Role?</strong></p>\n<ul>\n<li>Great Visibility: You will provide the analytical backbone for the entire DLG organisation, turning cross-functional data into a unified narrative.</li>\n</ul>\n<ul>\n<li>Autonomy: We trust you to own your domain. This isn&#39;t a role for a cog in the machine, you’ll have the freedom to identify opportunities and drive the strategy.</li>\n</ul>\n<ul>\n<li>True Variety: From analysing startup ecosystem trends to optimising conversion funnels, no two weeks will look the same.</li>\n</ul>\n<ul>\n<li>Deep Collaboration: You’ll sit at the intersection of product, marketing, and community, working with stakeholders across the entire company to fuel our growth.</li>\n</ul>\n<p>If you’re a curious analyst who thrives in a fast-paced, developer first environment and wants to see their work directly influence how the world’s engineers build, we’d love to meet you.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Build and maintain Tableau dashboards and reports related to the DLG business</li>\n</ul>\n<ul>\n<li>Build and maintain DBT models and custom SQL queries to analyse DLG business</li>\n</ul>\n<ul>\n<li>Investigate and understand discrepancies in the data</li>\n</ul>\n<ul>\n<li>Help DLG automate and scale processes, find areas of opportunity for process improvements.</li>\n</ul>\n<ul>\n<li>Analyse A/B experiments and DLG initiatives.</li>\n</ul>\n<ul>\n<li>Collaborate with sales, ops, marketing, finance and product to understand their challenges, dive into the data and share insights that solve business problems.</li>\n</ul>\n<p><strong>Qualifications</strong></p>\n<ul>\n<li>+5 years of experience working in data analytics-related fields.</li>\n</ul>\n<ul>\n<li>Bachelor’s degree in math, statistics, computer science, economics or relevant field.</li>\n</ul>\n<ul>\n<li>Strong proficiency in SQL and 
data modelling: cleaning, modelling and transforming large and complex datasets into usable and trusted models.</li>\n</ul>\n<ul>\n<li>Familiarity with DBT (data build tool), Git and Snowflake data warehouse is preferred.</li>\n</ul>\n<ul>\n<li>Working knowledge of visualisation tools, ideally Tableau, and Google Sheets.</li>\n</ul>\n<ul>\n<li>Ability to analyse data discrepancies and troubleshoot data issues.</li>\n</ul>\n<ul>\n<li>Excellent problem-solving skills with the ability to think critically and translate complex data into actionable insights.</li>\n</ul>\n<ul>\n<li>Familiar with AI tooling, e.g. Copilot, Clay.</li>\n</ul>\n<ul>\n<li>Excellent verbal and written communication skills, with the ability to collaborate with cross-functional teams including technical teams (data engineering) and business users (marketing, product, revenue, etc).</li>\n</ul>\n<ul>\n<li>Stakeholder management and prioritisation skills</li>\n</ul>\n<p>Nice to have</p>\n<ul>\n<li>PLG experience</li>\n</ul>\n<ul>\n<li>Experience building and working with AI Agents</li>\n</ul>\n<ul>\n<li>Translating complex developer behaviour into product insights</li>\n</ul>\n<ul>\n<li>AB testing experimentation experience</li>\n</ul>\n<ul>\n<li>Revenue modelling and forecasting knowledge</li>\n</ul>\n<p>The annual base salary range for this position for candidates located in California (excluding San Francisco Bay Area), Colorado, Illinois, New York, and Washington is between: $114,000-$156,200 USD</p>
","url":"https://yubhub.co/jobs/job_25a942db-90e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7683013","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$114,000-$156,200 USD","x-skills-required":["SQL","DBT","Git","Snowflake data warehouse","Tableau","Google Sheets","AI tooling","Copilot","Clay"],"x-skills-preferred":["PLG experience","Experience building and working with AI Agents","Translating complex developer behaviour into product insights","AB testing experimentation experience","Revenue modelling and forecasting knowledge"],"datePosted":"2026-04-18T15:44:01.511Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Chicago, Illinois; New York, New York; Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, DBT, Git, Snowflake data warehouse, Tableau, Google Sheets, AI tooling, Copilot, Clay, PLG experience, Experience building and working with AI Agents, Translating complex developer behaviour into product insights, AB testing experimentation experience, Revenue modelling and forecasting knowledge","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":114000,"maxValue":156200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e500a909-0d2"},"title":"Senior Data Analyst","description":"<p>As a Senior Data Analyst, you will play a crucial role in our data-driven decision-making process. 
You will be responsible for turning raw data into actionable insights that will shape our product strategy and drive business growth.</p>\n<p>This role requires a deep understanding of product analytics, a strong technical skillset, and a forward-thinking mindset to leverage AI in your analysis.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design and build insightful and user-friendly dashboards and reports in Tableau to track key product metrics and performance indicators.</li>\n<li>Utilize Snowflake and dbt to build and maintain robust and scalable data models and pipelines for our analytics needs.</li>\n<li>Conduct in-depth product analysis to identify trends, patterns, and opportunities for product improvement and growth.</li>\n<li>Write complex SQL queries and use Python to perform advanced data analysis.</li>\n<li>Collaborate with product managers, engineers, and other stakeholders to understand their data needs and provide them with the insights they need to make informed decisions.</li>\n<li>Proactively identify and explore new ways to leverage AI and machine learning to enhance our analytical capabilities and unlock new insights from our data.</li>\n<li>Communicate your findings and recommendations effectively to both technical and non-technical audiences.</li>\n</ul>\n<p>Qualifications:</p>\n<ul>\n<li>Proven experience as a Data Analyst or in a similar role, with a focus on product analytics.</li>\n<li>3-5 years of experience.</li>\n<li>Expert-level proficiency in SQL.</li>\n<li>Strong experience with data visualization tools, particularly Tableau.</li>\n<li>Hands-on experience with cloud data warehouses like Snowflake and data transformation tools like dbt.</li>\n<li>Advanced analytics, data science, AI/ML experience and techniques are a plus.</li>\n<li>A strong analytical mindset with the ability to think critically and solve complex problems.</li>\n<li>A proactive and curious mindset with a strong desire to learn and experiment with new technologies, 
especially in the realm of AI.</li>\n<li>Excellent communication and collaboration skills.</li>\n</ul>\n<p>Added Advantage:</p>\n<ul>\n<li>Experience with other BI tools like Looker.</li>\n<li>Experience with statistical analysis and machine learning.</li>\n<li>Familiarity with product analytics platforms like Amplitude or Mixpanel</li>\n</ul>","url":"https://yubhub.co/jobs/job_e500a909-0d2","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7731263","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Tableau","Snowflake","dbt","Python","Data Analysis","Data Visualization","Cloud Data Warehouses","Data Transformation"],"x-skills-preferred":["Advanced Analytics","Data Science","AI/ML","Statistical Analysis","Machine Learning","Product Analytics Platforms"],"datePosted":"2026-04-18T15:43:56.710Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Tableau, Snowflake, dbt, Python, Data Analysis, Data Visualization, Cloud Data Warehouses, Data Transformation, Advanced Analytics, Data Science, AI/ML, Statistical Analysis, Machine Learning, Product Analytics Platforms"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_93c1356c-a95"},"title":"Principal Software Engineer, Web Data - Tech Lead","description":"<p>We&#39;re looking for an exceptional Principal Software Engineer to serve as the de facto Technical Lead for our Web Data Acquisition (WDA) team. 
This is a highly visible, hands-on technical leadership role where you&#39;ll own the architectural direction for crawling systems, evolve and unify crawling platforms into a best-in-class stack, and elevate a high-performing engineering team.</p>\n<p>As a Principal Software Engineer, you&#39;ll solve complex distributed systems challenges, build modular tooling that accelerates delivery, and set the standard for observability and operational excellence. You&#39;ll have a dedicated manager handling all HR and administrative responsibilities. A product manager connects business needs with technical work. Your focus is 100% technical leadership, mentorship, and hands-on execution.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Technical Leadership &amp; System Design: Proven experience building web crawling or large-scale data systems from scratch. Strong architectural skills designing scalable, fault-tolerant distributed systems. Track record leading complex technical initiatives and driving architecture direction for teams.</li>\n</ul>\n<ul>\n<li>Data Engineering Expertise: Deep background in large-scale data engineering (terabytes daily). Hands-on experience with cloud data warehouses (BigQuery, Snowflake). Experience with Apache Kafka, Kubernetes (GKE/EKS), and orchestration tools (Airflow).</li>\n</ul>\n<ul>\n<li>Web Crawling &amp; Data Extraction: Deep expertise in web crawling technologies and advanced scraping (Scrapy or similar). Experience extracting structured/unstructured web data and SERP extraction. Knowledge of proxy infrastructure management, anti-bot detection, and ethical crawling.</li>\n</ul>\n<ul>\n<li>Leadership &amp; Team Development: Experience mentoring engineers at all levels and fostering collaborative culture. Strong ability to influence technical direction and establish best practices. 
Track record hiring, coaching, and developing senior engineers.</li>\n</ul>\n<p>Ideal Candidate Profile:</p>\n<ul>\n<li>10+ years software engineering experience. 5+ years focused on data engineering. 3+ years in senior/principal-level technical leadership.</li>\n</ul>\n<ul>\n<li>Strong CS fundamentals (algorithms, data structures, distributed systems). Self-starter who thrives in fast-paced environments.</li>\n</ul>\n<p>Core Technical Stack:</p>\n<ul>\n<li>Python &amp; Java</li>\n<li>Apache Kafka</li>\n<li>GCP (BigQuery, GKE, Vertex AI)</li>\n<li>Snowflake &amp; Starburst/Trino</li>\n<li>Terraform</li>\n<li>Scrapy / Web Scraping Frameworks</li>\n<li>Proxy Management Systems</li>\n<li>Distributed Systems &amp; Kubernetes</li>\n<li>Apache Airflow</li>\n<li>Large-Scale ETL Pipelines</li>\n</ul>","url":"https://yubhub.co/jobs/job_93c1356c-a95","directApply":true,"hiringOrganization":{"@type":"Organization","name":"ZoomInfo","sameAs":"https://www.zoominfo.com/","logo":"https://logos.yubhub.co/zoominfo.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/zoominfo/jobs/8378092002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$163,800-$257,400 USD","x-skills-required":["Python","Java","Apache Kafka","Kubernetes","GCP","Snowflake","Terraform","Scrapy","Proxy Management Systems","Distributed Systems","Apache Airflow","Large-Scale ETL Pipelines"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:43:50.896Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Java, Apache Kafka, Kubernetes, GCP, Snowflake, Terraform, Scrapy, Proxy Management Systems, Distributed Systems, Apache Airflow, Large-Scale ETL 
Pipelines","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":163800,"maxValue":257400,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3bf08e11-5e7"},"title":"Integrations Support Specialist II","description":"<p>ZoomInfo is where careers accelerate. We move fast, think boldly, and empower you to do the best work of your life. You&#39;ll be surrounded by teammates who care deeply, challenge each other, and celebrate wins. With tools that amplify your impact and a culture that backs your ambition, you won&#39;t just contribute. You&#39;ll make things happen–fast.</p>\n<p>About the Role:</p>\n<p>We&#39;re looking for an experienced, results-oriented Integration Support Specialist who excels in communication, presence, and confidence. This role is an integral part of our strategy to ensure every customer is successful. You will work with our customers to build relationships and drive value based on customer-defined goals, and will be responsible for delivering exceptional customer experiences through friendly, efficient, accurate service and quick resolution of customer incidents and inquiries.</p>\n<p>Shift:</p>\n<p>Flexibility to work in EST &amp; PST Hybrid work set up:</p>\n<p>3 days in office and 2 days work from home</p>\n<p>What You&#39;ll Do:</p>\n<p>Provide day-to-day support for our growing customer base, including both incident management and providing workflow recommendations to ensure customers can get the most out of our platform and its integrations.</p>\n<p>Effectively triage and manage escalations to engineering teams for issues that can’t be resolved</p>\n<p>Document best practices and other useful information to better enable our customers through our online support tools</p>\n<p>Serve as the voice of the customer by ensuring customer feedback is clearly captured and conveyed internally to enable ongoing 
improvement of ZoomInfo products and services</p>\n<p>Learn third-party products and their integrations to educate and guide customers on usage and product adoption</p>\n<p>Identify renewal risks and up-sell opportunities, collaborate with internal teams to remediate issues, ensuring a high level of customer satisfaction enabling a successful renewal</p>\n<p>Other related duties as assigned</p>\n<p>What You Bring:</p>\n<p>Bachelor’s degree Mandatory and 2-3 years of equivalent work experience in SaaS organization</p>\n<p>Adaptable to Night shift (EST / PST)</p>\n<p>Technical CRM, MAT, or Sales Acceleration platform experience; Certification with Salesforce, Marketo, HubSpot, Eloqua and/or Microsoft Dynamics is a strong plus</p>\n<p>Experience with SOQL (Salesforce), and troubleshooting Salesforce Managed Packages</p>\n<p>Familiarity with troubleshooting or interacting with API&#39;s</p>\n<p>Prior experience using DataDog, Jira, and Snowflake</p>\n<p>Prior experience with video conferencing applications</p>\n<p>Proven ability to multi-task and successfully manage multiple priorities simultaneously</p>\n<p>Must have a strong attention to detail and be a self-directed problem solver</p>\n<p>Ability to adapt and pivot in a fast paced, ever-changing environment</p>\n<p>Excellent customer service skills and the ability to be empathetic, accurate, compassionate, responsive, resourceful, and conscientious</p>\n<p>A strong sense of urgency</p>\n<p>Ability to empower end-users to support themselves using our online training resources</p>\n<p>Excellent organisational, written and oral communication skills – You must be able to convey technical jargon in a wide-array of syntax from beginner-level users to developers</p>\n<p>Ability to evaluate, troubleshoot, and follow-up on customer issues as well as replicate and document for further escalation</p>\n<p>A desire and aptitude to learn and understand technical infrastructure</p>\n<p>A positive attitude</p>
","url":"https://yubhub.co/jobs/job_3bf08e11-5e7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"ZoomInfo","sameAs":"https://www.zoominfo.com/","logo":"https://logos.yubhub.co/zoominfo.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/zoominfo/jobs/8453263002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Technical CRM","MAT","Sales Acceleration platform","SOQL","Salesforce Managed Packages","APIs","DataDog","Jira","Snowflake","video conferencing applications"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:43:21.963Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru, Karnataka, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Technical CRM, MAT, Sales Acceleration platform, SOQL, Salesforce Managed Packages, APIs, DataDog, Jira, Snowflake, video conferencing applications"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9ce3bb01-4a1"},"title":"Scale Solutions Engineer","description":"<p>At Databricks, we aim to empower our customers to solve the world&#39;s most challenging data problems using the Data Intelligence platform. As a Scale Solution Engineer, you will be critical in advising customers during their onboarding. 
You will work directly with customers to help them onboard and deploy Databricks in their production environment and accelerate Databricks features adoption.</p>\n<p>The impact you will have:</p>\n<ul>\n<li>Ensure new customers have an excellent experience by providing technical assistance early in their journey</li>\n<li>Become an expert on the Databricks Platform and guide customers in making the best technical decisions</li>\n<li>Work directly with multiple customers concurrently to provide technical solutions</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>Undergraduate degree or higher in Computer Science, Information Systems, or relevant experience</li>\n<li>3+ years experience in a customer-facing technical role in pre-sales, professional services, consulting or customer success</li>\n<li>Experience in one or more of the following:</li>\n</ul>\n<ul>\n<li>Solid understanding of the end-to-end data analytics workflow</li>\n<li>Excellent time management and prioritization skills</li>\n<li>Knowledge of public cloud platforms AWS, Azure or GCP would be a plus</li>\n<li>Knowledge of a programming language - Python, Scala, or SQL</li>\n<li>Knowledge of end-to-end data analytics workflow</li>\n<li>Hands-on professional or academic experience in one or more of the following:</li>\n</ul>\n<ul>\n<li>Data Engineering technologies (e.g., ETL, DBT, Spark, Airflow)</li>\n<li>Data Warehousing technologies (e.g., SQL, Stored Procedures, Redshift, Snowflake)</li>\n<li>Excellent written and verbal communication, in English and Portuguese</li>\n<li>Bonus - Knowledge of Data Science and Machine Learning (e.g., build and deploy ML Models).</li>\n<li>Databricks certification(s)</li>\n</ul>
","url":"https://yubhub.co/jobs/job_9ce3bb01-4a1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8391865002","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Databricks","Data Engineering","Data Warehousing","Python","Scala","SQL","AWS","Azure","GCP","ETL","DBT","Spark","Airflow","Redshift","Snowflake","English","Portuguese"],"x-skills-preferred":["Data Science","Machine Learning"],"datePosted":"2026-04-18T15:43:14.531Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Sao Paulo, Brazil"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Databricks, Data Engineering, Data Warehousing, Python, Scala, SQL, AWS, Azure, GCP, ETL, DBT, Spark, Airflow, Redshift, Snowflake, English, Portuguese, Data Science, Machine Learning"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6ac59b11-215"},"title":"Engineering Manager, Onboarding","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. 
By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>\n<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>\n<p>What you’ll do</p>\n<p>You will lead an engineering team focused on building the systems and product experiences that power customer activation at Brex, including onboarding, account setup, verifications, and integrations workflows that help customers realize value quickly.</p>\n<p>This role requires strategic thinking, operational excellence, technical leadership, and a deep passion for delivering frictionless, AI-enhanced customer journeys.</p>\n<p>The ideal candidate is an engineering leader with experience scaling user-facing onboarding systems, delivering high-quality product experiences, and partnering deeply across Product, Design, Operations, and GTM teams.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Take an active role in driving business and product strategies, championing a seamless, intuitive, and efficient onboarding experience.</li>\n</ul>\n<ul>\n<li>Collaborate with cross-functional partners across Product, Design, Operations, and Sales to define priorities and deliver delightful customer activation experiences.</li>\n</ul>\n<ul>\n<li>Leverage AI to reimagine and automate onboarding and implementation workflows, improving speed, personalization, and operational leverage.</li>\n</ul>\n<ul>\n<li>Drive execution of the Onboarding roadmap, ensuring timely, high-quality delivery of 
systems and features that help customers activate and realize value.</li>\n</ul>\n<ul>\n<li>Lead and manage a team of engineers, including hiring, mentoring, performance management, and establishing strong technical direction.</li>\n</ul>\n<ul>\n<li>Drive continuous improvement in engineering processes, technical architecture, and product quality.</li>\n</ul>\n<ul>\n<li>Foster a culture of innovation, collaboration, accountability, and customer obsession across the team.</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.</li>\n</ul>\n<ul>\n<li>6+ years of software engineering experience with strong technical depth.</li>\n</ul>\n<ul>\n<li>3+ years of experience managing or leading engineers in a high-growth environment.</li>\n</ul>\n<ul>\n<li>Strong technical background and understanding of software development principles.</li>\n</ul>\n<ul>\n<li>Expertise leading full-stack engineering teams delivering end-to-end product experiences.</li>\n</ul>\n<ul>\n<li>Regularly works with cross-functional partners (e.g. 
Product, Design, Operations, Sales) and excels in driving alignment across stakeholders.</li>\n</ul>\n<ul>\n<li>Data-driven mindset with the ability to evaluate impact, measure funnel performance, and optimize activation metrics.</li>\n</ul>\n<ul>\n<li>Track record building AI-powered product experiences, including LLM-driven automation and personalization.</li>\n</ul>\n<p>Bonus points</p>\n<ul>\n<li>Experience with data platforms such as Snowflake, Hex, or similar.</li>\n</ul>\n<ul>\n<li>Experience building systems related to onboarding, implementation, identity, workflow automation, customer lifecycle products, or other customer facing experiences.</li>\n</ul>\n<ul>\n<li>You have started your own technology venture or were an early technical founder/employee.</li>\n</ul>\n<p>Experience Level: senior Employment Type: full-time Workplace Type: hybrid Category: Engineering Industry: Technology Salary Range: $240,000 CAD - $300,000 CAD Required Skills:</p>\n<ul>\n<li>Software engineering experience</li>\n<li>Technical leadership</li>\n<li>AI-powered product experiences</li>\n<li>Data-driven mindset</li>\n<li>Cross-functional collaboration</li>\n</ul>\n<p>Preferred Skills:</p>\n<ul>\n<li>Data platforms (Snowflake, Hex)</li>\n<li>Onboarding, implementation, identity, workflow automation</li>\n<li>Customer lifecycle products</li>\n<li>Technology venture founding/early employee experience</li>\n</ul>","url":"https://yubhub.co/jobs/job_6ac59b11-215","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8461597002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$240,000 CAD - $300,000 CAD","x-skills-required":["Software engineering 
experience","Technical leadership","AI-powered product experiences","Data-driven mindset","Cross-functional collaboration"],"x-skills-preferred":["Data platforms (Snowflake, Hex)","Onboarding, implementation, identity, workflow automation","Customer lifecycle products","Technology venture founding/early employee experience"],"datePosted":"2026-04-18T15:41:57.166Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Vancouver, British Columbia, Canada"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Software engineering experience, Technical leadership, AI-powered product experiences, Data-driven mindset, Cross-functional collaboration, Data platforms (Snowflake, Hex), Onboarding, implementation, identity, workflow automation, Customer lifecycle products, Technology venture founding/early employee experience","baseSalary":{"@type":"MonetaryAmount","currency":"CAD","value":{"@type":"QuantitativeValue","minValue":240000,"maxValue":300000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_cece3778-5b8"},"title":"Finance Systems Integration Engineer","description":"<p>We are seeking an experienced Finance Systems Integration Engineer to support our finance systems transformation at one of the fastest-growing AI companies. You&#39;ll design and build integrations connecting our ERP platform with critical financial applications and support our ERP implementation initiatives.</p>\n<p>As you master our integration landscape, you&#39;ll have opportunities to expand into Claude-powered AI automation and data pipeline development.</p>\n<p>You&#39;ll build the integration backbone for one of the fastest-growing AI companies, with a front-row seat to how Claude transforms financial operations. 
This is a foundational role where you&#39;ll shape our integration architecture from the ground up, then expand into cutting-edge AI automation as our needs evolve.</p>\n<p><strong>Responsibilities</strong></p>\n<p><strong>Core Focus: Integration Development &amp; ERP Support</strong></p>\n<ul>\n<li>Design, build, and maintain integrations connecting ERP systems with downstream applications including ZipHQ, Brex, Navan, Clearwater, Payroll systems, Salesforce, and other critical financial platforms using Workato, MuleSoft, or similar iPaaS solutions</li>\n</ul>\n<ul>\n<li>Support integration development and testing during the ERP implementation projects</li>\n</ul>\n<ul>\n<li>Develop and maintain REST APIs, webhooks, and OAuth 2.0 authentication flows for secure system-to-system communication</li>\n</ul>\n<ul>\n<li>Implement real-time and batch integration patterns supporting high-volume financial transactions</li>\n</ul>\n<ul>\n<li>Establish monitoring, alerting, and error-handling frameworks to ensure integration reliability and data integrity</li>\n</ul>\n<ul>\n<li>Document integration architectures, data flows, API specifications, and troubleshooting procedures</li>\n</ul>\n<ul>\n<li>Collaborate with implementation consulting partners and vendors on technical integration requirements</li>\n</ul>\n<p><strong>Additional Scope: AI Automation &amp; Data Infrastructure</strong></p>\n<ul>\n<li>Build and deploy Claude-powered AI agents that automate financial operations including intelligent document processing, workflow automation, financial audit and reconciliations, and self-service reporting</li>\n</ul>\n<ul>\n<li>Design agentic workflows that leverage Claude API capabilities integrated with ERP platform data and processes</li>\n</ul>\n<ul>\n<li>Create automated validation and quality assurance processes for AI-generated outputs</li>\n</ul>\n<ul>\n<li>Partner with Finance teams to identify automation opportunities and translate requirements into AI agent 
solutions</li>\n</ul>\n<ul>\n<li>Support data pipeline development using Airflow for workflow orchestration and dbt for data transformation</li>\n</ul>\n<ul>\n<li>Build and maintain data flows from ERP and other financial systems into BigQuery for analytics and reporting</li>\n</ul>\n<ul>\n<li>Implement data quality checks and testing frameworks for financial data pipelines</li>\n</ul>\n<ul>\n<li>Collaborate with Data Infrastructure team on pipeline architecture, performance optimization, and security monitoring</li>\n</ul>\n<ul>\n<li>Support executive dashboards and financial analytics by ensuring timely, accurate data delivery</li>\n</ul>\n<p><strong>Governance &amp; Collaboration</strong></p>\n<ul>\n<li>Maintain comprehensive documentation for integrations, AI agents, and data pipelines</li>\n</ul>\n<ul>\n<li>Support internal and external audits with technical evidence and system access reviews</li>\n</ul>\n<ul>\n<li>Collaborate with Finance Systems Engineers on operational support, troubleshooting, and enhancement requests</li>\n</ul>\n<ul>\n<li>Partner with Finance Operations, Accounting, FP&amp;A, Engineering, and Data Infrastructure teams to deliver holistic solutions</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>8+ years of experience in integration development, data engineering, or systems engineering roles</li>\n</ul>\n<ul>\n<li>Hands-on experience with iPaaS platforms such as Workato, MuleSoft, Dell Boomi, or similar integration tools</li>\n</ul>\n<ul>\n<li>Strong programming skills in Python and/or JavaScript/TypeScript for building custom integrations, APIs, and automation scripts</li>\n</ul>\n<ul>\n<li>Experience with data pipeline tools including Airflow for orchestration and dbt for transformation</li>\n</ul>\n<ul>\n<li>Working knowledge of cloud data platforms such as BigQuery, Snowflake, or Databricks</li>\n</ul>\n<ul>\n<li>Understanding of REST API design patterns, webhooks, OAuth 2.0, and modern integration 
architectures</li>\n</ul>\n<ul>\n<li>Familiarity with ERP systems (Oracle Fusion, Workday Financials, or similar) and financial business processes</li>\n</ul>\n<ul>\n<li>Strong problem-solving skills with ability to debug complex integration issues across multiple systems</li>\n</ul>\n<ul>\n<li>Excellent communication skills to collaborate with technical and business stakeholders</li>\n</ul>\n<p><strong>Preferred Qualifications</strong></p>\n<ul>\n<li>Experience with high-growth technology companies scaling through rapid revenue expansion (5x-10x growth)</li>\n</ul>\n<ul>\n<li>Background in AI/ML companies with familiarity in modern SaaS business models including consumption-based pricing, usage metering platforms, and marketplace billing</li>\n</ul>\n<ul>\n<li>Hands-on experience with specific platforms: Workday Financials (Workday Studio, EIB, custom reports, Prism Analytics)</li>\n</ul>\n<ul>\n<li>Technical expertise with modern finance tech stack including Stripe, Salesforce, Zuora RevPro, Zip Procurement, Clearwater treasury systems, Pigment planning tools, Numeric close management</li>\n</ul>\n<ul>\n<li>Programming skills in Python / JavaScript, or similar languages for building custom integrations, APIs, and automation scripts</li>\n</ul>\n<ul>\n<li>Experience with AI/LLM integration for financial operations, including document processing, data extraction, intelligent automation, and agentic workflows (familiarity with Claude models and API is a plus)</li>\n</ul>\n<ul>\n<li>Hands-on experience with modern data stack tools: BigQuery/Snowflake/Databricks, dbt for data transformation, Airflow for workflow orchestration</li>\n</ul>\n<ul>\n<li>Professional certifications such as Workato, Workday integrations, or relevant technical credentials</li>\n</ul>\n<ul>\n<li>Bachelor&#39;s or Master&#39;s degree in Computer Science, Information Systems, Accounting, Finance, Engineering, or related technical/business field</li>\n</ul>\n<ul>\n<li>Experience with business 
intelligence and financial reporting tools (Hex, Looker, Tableau, Power BI) for executive dashboards and financial analytics</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_cece3778-5b8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://anthropic.com","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5155195008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$205,000-$265,000 USD","x-skills-required":["integration development","data engineering","systems engineering","iPaaS platforms","Python","JavaScript/TypeScript","Airflow","dbt","BigQuery","Snowflake","Databricks","REST API design patterns","webhooks","OAuth 2.0","modern integration architectures","ERP systems","financial business processes"],"x-skills-preferred":["high-growth technology companies","AI/ML companies","SaaS business models","consumption-based pricing","usage metering platforms","marketplace billing","Workday Financials","Stripe","Salesforce","Zuora RevPro","Zip Procurement","Clearwater treasury systems","Pigment planning tools","Numeric close management","Python/JavaScript","AI/LLM integration","document processing","data extraction","intelligent automation","agentic workflows","Claude models","API","BigQuery/Snowflake/Databricks","professional certifications","Workato","Workday integrations","technical credentials","Computer Science","Information Systems","Accounting","Finance","Engineering","business intelligence","financial reporting tools"],"datePosted":"2026-04-18T15:39:50.764Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA | Seattle, 
WA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"integration development, data engineering, systems engineering, iPaaS platforms, Python, JavaScript/TypeScript, Airflow, dbt, BigQuery, Snowflake, Databricks, REST API design patterns, webhooks, OAuth 2.0, modern integration architectures, ERP systems, financial business processes, high-growth technology companies, AI/ML companies, SaaS business models, consumption-based pricing, usage metering platforms, marketplace billing, Workday Financials, Stripe, Salesforce, Zuora RevPro, Zip Procurement, Clearwater treasury systems, Pigment planning tools, Numeric close management, Python/JavaScript, AI/LLM integration, document processing, data extraction, intelligent automation, agentic workflows, Claude models, API, BigQuery/Snowflake/Databricks, professional certifications, Workato, Workday integrations, technical credentials, Computer Science, Information Systems, Accounting, Finance, Engineering, business intelligence, financial reporting tools","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":205000,"maxValue":265000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_facf5d80-7bd"},"title":"Solutions Engineer, Delivery & Automation","description":"<p>We&#39;re looking for a Solutions Engineer who gets energized by solving gnarly technical problems and making customers wildly successful. 
As the technical quarterback for new customer onboardings, you&#39;ll translate their vision into working integrations, navigate the chaos of healthcare data standards, and ensure they extract real value from day one.</p>\n<p>Key responsibilities:</p>\n<p>Own the technical journey - Lead end-to-end onboarding for new customers, from authentication setup to data mart configuration</p>\n<p>Integrate customer systems with Zus (APIs, SFTP, HL7, FHIR, the whole interoperability stack)</p>\n<p>Translate messy business requirements into clean technical architectures</p>\n<p>Build and maintain automated workflows that make implementations faster and more reliable</p>\n<p>Drive customer success through technical excellence - Be the trusted technical advisor customers call when things get complicated</p>\n<p>Run technical deep dives and implementation reviews that actually move the needle</p>\n<p>Identify integration risks before they become blockers and solve them proactively</p>\n<p>Train customers on best practices so they become power users, not support tickets</p>\n<p>Innovate on process - Use AI tools (LLMs, automation platforms, scripting) to eliminate manual work and scale your impact</p>\n<p>Build templates, scripts, and tooling that make the 10th implementation faster than the 1st</p>\n<p>Document learnings and create repeatable playbooks through automation that make the whole team better</p>\n<p>Collaborate with R&amp;D - Partner closely with Product and Engineering to surface integration challenges and opportunities for platform improvement</p>\n<p>Translate real-world customer integration patterns into product feedback and roadmap insights</p>\n<p>Collaborate with R&amp;D teams on emerging capabilities around AI, data pipelines, and developer tooling</p>\n<p>Act as the voice of the customer when identifying opportunities to improve developer experience and reduce integration friction</p>\n<p>You&#39;ll enjoy solving messy integration challenges, building 
automation that eliminates manual work, and partnering closely with Product and Engineering to continuously improve the platform.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_facf5d80-7bd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Zus","sameAs":"https://zus.com/","logo":"https://logos.yubhub.co/zus.com.png"},"x-apply-url":"https://jobs.lever.co/zushealth/fbe45c72-4269-4c7f-b88c-6df3349c2479","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$125,000-165,000 per year","x-skills-required":["healthcare data standards (FHIR, HL7, CCD)","major EMRs (Epic, Cerner, athenahealth)","API and data pipeline experience (ETL, REST APIs, JSON, CSV ingestion)","data platforms (Snowflake, SQL databases) including schema design and query optimization","Python scripting skills and SQL fluency","secure environments and compliance (HIPAA, SOC2)"],"x-skills-preferred":["AI tools (LLMs, automation platforms, scripting)","data pipelines","developer tooling"],"datePosted":"2026-04-17T13:12:29.884Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"healthcare data standards (FHIR, HL7, CCD), major EMRs (Epic, Cerner, athenahealth), API and data pipeline experience (ETL, REST APIs, JSON, CSV ingestion), data platforms (Snowflake, SQL databases) including schema design and query optimization, Python scripting skills and SQL fluency, secure environments and compliance (HIPAA, SOC2), AI tools (LLMs, automation platforms, scripting), data pipelines, developer 
tooling","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":125000,"maxValue":165000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_bd05f3e3-531"},"title":"Data/Analytics Engineer","description":"<p>About Mistral AI\nWe are seeking passionate and talented Data/Analytics Engineers to join our team.</p>\n<p>In this role, you will have the unique opportunity to build, optimize, and maintain our data infrastructure. You will work with large volumes of data, enabling product teams to access secure and reliable data quickly. Your contributions will support our science team in enhancing the quality of our state-of-the-art AI models and help business users make informed decisions.</p>\n<p>Responsibilities</p>\n<p>• Design, build, and maintain scalable data pipelines, ETL processes, and analytics infrastructure. Automate data quality checks and validation processes.\n• Collaborate with cross-functional teams to understand data needs and deliver high-quality, actionable solutions, e.g., work closely with machine learning teams to support model training, deployment pipelines, and feature stores.\n• Optimize data storage, retrieval, processing, and queries for performance, scalability, and cost-efficiency.\n• Define and enforce data governance, metadata management, and data lineage standards.\n• Ensure data integrity, security, and compliance with industry standards.</p>\n<p>About You</p>\n<p>• Master’s degree in Computer Science, Engineering, Statistics, or a related field.\n• 3+ years of experience in data engineering, analytics engineering, or a related role.\n• Proficiency in Python and SQL.\n• Experience with dbt.\n• Experience with cloud platforms (e.g., AWS, GCP, Azure) and data warehousing solutions (e.g., Snowflake, BigQuery, Redshift, Clickhouse).\n• Strong analytical and problem-solving skills, with 
attention to detail.\n• Ability to communicate complex data concepts to both technical and non-technical stakeholders.</p>\n<p>Nice to Have</p>\n<p>• Experience with machine learning pipelines, MLOps, and feature engineering.\n• Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes).\n• Familiarity with DevOps practices, CI/CD pipelines, and infrastructure-as-code (e.g., Terraform).\n• Background in building self-service data platforms for analytics and AI use cases.</p>\n<p>Hiring Process</p>\n<p>• Intro call with Recruiter - 30 min\n• Hiring Manager Interview - 30 min\n• Technical interview - Live Coding (Python/SQL) - 45 min\n• Technical interview - System Design - 45 min\n• Value talk interview - 30 mins\n• References</p>\n<p>Additional Information</p>\n<p>Location &amp; Remote</p>\n<p>The position is based in our Paris HQ offices and we encourage going to the office as much as we can (at least 3 days per week) to create bonds and smooth communication. Our remote policy aims to provide flexibility, improve work-life balance and increase productivity. Each manager can decide the amount of days worked remotely based on autonomy and a specific context (e.g. more flexibility can occur during summer). 
In any case, employees are expected to maintain regular communication with their teams and be available during core working hours.</p>\n<p>What We Offer</p>\n<p>💰 Competitive salary and equity package\n🧑‍⚕️ Health insurance\n🚴 Transportation allowance\n🥎 Sport allowance\n🥕 Meal vouchers\n💰 Private pension plan\n🍼 Generous parental leave policy</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_bd05f3e3-531","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mistral AI","sameAs":"https://mistral.ai","logo":"https://logos.yubhub.co/mistral.ai.png"},"x-apply-url":"https://jobs.lever.co/mistral/6f28da96-76f9-44bb-9b85-4e3519fde6d4","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","SQL","dbt","AWS","GCP","Azure","Snowflake","BigQuery","Redshift","Clickhouse"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:47:21.092Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Paris"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, SQL, dbt, AWS, GCP, Azure, Snowflake, BigQuery, Redshift, Clickhouse"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e503559e-cf7"},"title":"Senior Machine Learning Engineer","description":"<p><strong>Job Title: Senior Machine Learning Engineer</strong></p>\n<p><strong>Job Description:</strong></p>\n<p>Before 1965, it was extremely difficult and time-consuming to analyze complicated signals, like radio or images. You could solve it, but you had to throw a ton of compute at it. 
That all changed with the invention of the Fast Fourier transform, which could efficiently break that signal down into the frequencies that are a part of it.</p>\n<p>The Risk Onboarding team is working on efficiently reviewing customers’ applications without compromising on quality. We are the front line of defense for preventing money laundering and financial crimes, building systems to verify that someone is who they say they are and that we are allowed to do business with them.</p>\n<p><strong>About Us:</strong></p>\n<p>At Mercury, we craft an exceptional banking experience for startups. Our team is focused on ensuring our products create a safe environment that meets the needs of our customers, administrators, and regulators.</p>\n<p><strong>Job Responsibilities:</strong></p>\n<p>As part of this role, you will:</p>\n<ul>\n<li>Partner with data science &amp; engineering teams to design and deploy ML &amp; Gen AI microservices, primarily focusing on automating reviews</li>\n<li>Work with a full-stack engineering team to embed these services into the overall review experience, including human in the loop, escalations, and feeding human decisions back into the service</li>\n<li>Implement testing, observability, alerting, and disaster recovery for all services</li>\n<li>Implement tracing, performance, and regression testing</li>\n<li>Feel a strong sense of product ownership and actively seek responsibility – we often self-organize on small/medium projects, and we want someone who’s excited to help shape and build Mercury’s future</li>\n</ul>\n<p><strong>Ideal Candidate:</strong></p>\n<p>The ideal candidate for the role has:</p>\n<ul>\n<li>7+ years of experience in roles like machine learning engineering, data engineering, backend software engineering, and/or devops</li>\n<li>Expertise with:</li>\n</ul>\n<ul>\n<li>A full modern data stack: Snowflake, dbt, Fivetran, Airbyte, Dagster, Airflow</li>\n<li>SQL, dbt, Python</li>\n<li>OLAP / OLTP data modelling and 
architecture</li>\n<li>Key-value stores: Redis, dynamoDB, or equivalent</li>\n<li>Streaming / real-time data pipelines: Kinesis, Kafka, Redpanda</li>\n<li>API frameworks: FastAPI, Flask, etc.</li>\n<li>Production ML Service experience</li>\n<li>Working across a full-stack development environment, with experience transferable to Haskell, React, and TypeScript</li>\n</ul>\n<p><strong>Total Rewards Package:</strong></p>\n<p>The total rewards package at Mercury includes base salary, equity (stock options/RSUs), and benefits. Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry. New hire offers are made based on a candidate’s experience, expertise, geographic location, and internal pay equity relative to peers.</p>\n<p><strong>Salary Range:</strong></p>\n<p>Our target new hire base salary ranges for this role are the following:</p>\n<ul>\n<li>US employees (any location): $200,700 - $250,900</li>\n<li>Canadian employees (any location): CAD 189,700 - 237,100</li>\n</ul>\n<p><strong>Diversity &amp; Belonging:</strong></p>\n<p>Mercury values diversity &amp; belonging and is proud to be an Equal Employment Opportunity employer. 
All individuals seeking employment at Mercury are considered without regard to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, sexual orientation, or any other legally protected characteristic.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e503559e-cf7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Mercury","sameAs":"https://www.mercury.com/","logo":"https://logos.yubhub.co/mercury.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/mercury/jobs/5639559004","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$200,700 - $250,900 (US) | CAD 189,700 - 237,100 (Canada)","x-skills-required":["Snowflake","dbt","Fivetran","Airbyte","Dagster","Airflow","SQL","Python","OLAP / OLTP data modelling and architecture","Redis","dynamoDB","Kinesis","Kafka","Redpanda","FastAPI","Flask","Production ML Service experience","Haskell","React","TypeScript"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:45:16.566Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Snowflake, dbt, Fivetran, Airbyte, Dagster, Airflow, SQL, Python, OLAP / OLTP data modelling and architecture, Redis, dynamoDB, Kinesis, Kafka, Redpanda, FastAPI, Flask, Production ML Service experience, Haskell, React, 
TypeScript","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":200700,"maxValue":250900,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3048ccd4-7de"},"title":"Data Analyst","description":"<p>We are seeking a Data Analyst to join our growing data team. As a Data Analyst at LayerZero, you will be at the forefront of shaping a rich data foundation for a company making a real impact in the web3 space. You will work closely with teams and leaders to uncover insights, drive decision-making, and fuel our next-generation products and services.</p>\n<p>The successful candidate will dive headfirst into the world of crypto data, exploring on-chain wallets and contracts, block and transaction data, insights from in-house systems, and third-party intelligence. Your mission will be to combine these diverse datasets into rich, actionable data products for a broad group of stakeholders.</p>\n<p>Key responsibilities include:\nLeveraging and expanding our ever-growing Kimball dimensional model.\nWriting SQL to create and expand insights in our in-house reporting solutions.\nCollaborating with stakeholders across the organization to conduct ad-hoc explorations and analytics.\nBeing a key owner of data quality, building out insights that serve the data team itself.\nComposing pipelines by writing SQL code to clean, combine, refine, and aggregate data into the insights the organization needs.\nCollaborating on new datasets to ingest into our Snowflake data warehouse, working closely with data engineers on your team.\nNot afraid of pushing code that supports tens of billions of dollars in daily transaction volume.</p>\n<p>We are looking for someone with previous data analyst experience, likely with a bachelor&#39;s degree in Computer Science, Statistics, Mathematics, Physics, or related field, but we also consider and highly value equivalent 
practical experience.</p>\n<p>Required skills include strong SQL knowledge and experience, proven track record in data modeling, statistics, and analytics, experience working with a broad range of stakeholders, and strong convictions weakly held.\nNice-to-have skills include experience with general programming, experience with Snowflake, experience building DAG-based data pipelines, experience with streaming real-time data pipelines, previous experience with blockchain technologies, smart contracts, and decentralized finance, experience with Kimball dimensional modeling, and experience working on mid-to-large-scale data stacks.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3048ccd4-7de","directApply":true,"hiringOrganization":{"@type":"Organization","name":"LayerZero","sameAs":"https://layerzero.com/","logo":"https://logos.yubhub.co/layerzero.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/layerzerolabs/jobs/5787956004","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","statistics","analytics","Snowflake","Kimball dimensional modeling"],"x-skills-preferred":["general programming","DAG-based data pipelines","streaming real-time data pipelines","blockchain technologies","smart contracts","decentralized finance"],"datePosted":"2026-04-17T12:41:37.110Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Vancouver, BC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, statistics, analytics, Snowflake, Kimball dimensional modeling, general programming, DAG-based data pipelines, streaming real-time data pipelines, blockchain technologies, smart contracts, decentralized 
finance"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7619176a-424"},"title":"Forward Deployed Engineer","description":"<p>You will spend the majority of your time embedded with Hebbia&#39;s most strategic customers, building the last mile of our platform for their specific workflows, data, and domain. This is a hands-on engineering role. You write production code, you ship it, you own it.</p>\n<p>As a Forward Deployed Engineer, you are the bridge between Hebbia&#39;s platform and the real-world complexity of our customers&#39; environments. You sit with the customer&#39;s team, understand their hardest problems, and build solutions that make Hebbia indispensable. Then you bring what you&#39;ve learned back to our engineering and product teams to make the platform better for everyone.</p>\n<p>This role is for engineers who want to combine deep technical work with direct customer impact. You will see your code create value in days, not months. The FDE team operates at the intersection of engineering and go-to-market. You will work closely with our core engineering team (shared code review, architecture alignment, deploy pipelines) and with our account teams who direct where you deploy and what you focus on. 
Our team works in person 5 days a week at our offices in NYC and SF.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Embed with strategic accounts to deeply understand their domain, data, and workflows</li>\n<li>Build custom integrations, workflow automations, and domain-specific solutions on top of Hebbia&#39;s platform</li>\n<li>Write production code that deploys through our CI/CD pipelines and meets our engineering standards</li>\n<li>Own the technical relationship with the customer&#39;s team during your engagement</li>\n<li>Prototype fast, validate with the customer, iterate, and ship</li>\n<li>Return from engagements and work with engineering and product to generalize reusable patterns into platform capabilities</li>\n<li>Participate in code review, on-call rotation, and architecture discussions alongside core engineering</li>\n<li>Build connectors to customer data sources and document management systems</li>\n</ul>\n<p>Who You Are:</p>\n<ul>\n<li>5+ years software development experience at a venture-backed startup or top technology firm</li>\n<li>Strong full-stack engineering skills. You build across the stack: APIs, data pipelines, frontend when needed, infrastructure when needed.</li>\n<li>Comfortable working in ambiguity. Customer problems are messy and underspecified. You figure it out.</li>\n<li>High customer empathy. You enjoy sitting with users, understanding their workflows, and translating pain points into technical solutions.</li>\n<li>Fast and pragmatic. You prototype, validate, and ship in days and weeks, not quarters.</li>\n<li>Strong communicator. You are the primary technical point of contact for the customer. You can talk to both engineers and executives.</li>\n<li>Experience with cloud platforms (e.g., AWS) and modern backend technologies (Python, TypeScript, Go)</li>\n<li>Experience with data integrations, ETL pipelines, or enterprise data systems (S3, Snowflake, SharePoint, etc.) 
is a plus</li>\n<li>Experience with LLMs, RAG systems, or applied AI is a plus but not required</li>\n<li>Prior experience in finance, legal, or consulting domains is a plus</li>\n<li>Experience with customer-facing engineering roles (solutions engineering, professional services, or similar) is a plus</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7619176a-424","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Hebbia","sameAs":"https://hebbia.com","logo":"https://logos.yubhub.co/hebbia.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/hebbia/jobs/4679338005","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$180,000 to $300,000","x-skills-required":["Full-stack engineering","Cloud platforms (e.g., AWS)","Modern backend technologies (Python, TypeScript, Go)","Data integrations, ETL pipelines, or enterprise data systems (S3, Snowflake, SharePoint, etc.)","Customer-facing engineering roles (solutions engineering, professional services, or similar)"],"x-skills-preferred":["LLMs, RAG systems, or applied AI","Finance, legal, or consulting domains"],"datePosted":"2026-04-17T12:37:49.316Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City; San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Full-stack engineering, Cloud platforms (e.g., AWS), Modern backend technologies (Python, TypeScript, Go), Data integrations, ETL pipelines, or enterprise data systems (S3, Snowflake, SharePoint, etc.), Customer-facing engineering roles (solutions engineering, professional services, or similar), LLMs, RAG systems, or applied AI, Finance, legal, or consulting 
domains","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":300000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8a8c0eb9-6e6"},"title":"Data Scientist, Product","description":"<p><strong>Job Title: Data Scientist, Product</strong></p>\n<p>This is the founding hire for product analytics at Hebbia. As a data scientist, you will define what our core product metrics are: what counts as an active user, what engagement actually means, what signals correlate with retention.</p>\n<p>This is not a dashboarding role. The goal is to shape product decisions with data, not just report on them. You will identify which workflows drive repeat usage, where users drop off, what features move engagement, and what differentiates power users from casual users across our enterprise customer base.</p>\n<p>The role sits at the intersection of analytics engineering, product analytics, and data science. You will build the infrastructure and do the analysis. Define the metrics, build the pipelines, create the dashboards, and use what you built to inform the roadmap.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Define and implement Hebbia&#39;s core product metrics from scratch: active users, engagement, retention, feature adoption, account health. Build the canonical definitions the entire company uses.</li>\n<li>Design and build the product analytics infrastructure: fact tables, clean data models, and the analytics layer that sits on top of our product data.</li>\n<li>Build and maintain executive and product dashboards that leadership and product teams use to make decisions.</li>\n<li>Write DAGs, transforms, and data pipelines that support analytics. 
Work with engineering to instrument the product so usage data is captured correctly.</li>\n<li>Analyze customer behavior across our B2B customer base: account-level usage patterns, workflow adoption, expansion signals, and churn risk indicators.</li>\n<li>Inform the product roadmap using data. Identify friction in user flows, surface feature adoption patterns, and highlight opportunities for product improvement.</li>\n<li>Partner with product managers and engineers to translate product questions into measurable data and structured experiments.</li>\n<li>Establish data quality standards and documentation so the metrics layer you build is trusted and maintained.</li>\n</ul>\n<p><strong>Who You Are</strong></p>\n<ul>\n<li>3+ years of experience in product analytics, analytics engineering, or data science at a B2B SaaS company or high-growth startup</li>\n<li>Strong in SQL and Python. You can write production-quality transforms, not just ad hoc queries.</li>\n<li>Experience with modern data stack tools: dbt, Airflow, Snowflake, BigQuery, or similar. You understand data modeling and warehouse architecture.</li>\n<li>You have built dashboards and reporting that product teams and leadership actually use to make decisions</li>\n<li>You understand B2B product analytics: account-level metrics, multi-user workflows, enterprise engagement patterns, and why B2B retention analysis is different from consumer</li>\n<li>You translate ambiguous product questions into structured analyses. You do not wait for someone to hand you a spec.</li>\n<li>Strong product intuition. You care about why users behave the way they do, not just what the numbers say.</li>\n<li>Clear communicator. You can present findings to engineers, product managers, and executives with equal effectiveness.</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>The salary range for this position is set between $180,000 to $260,000. 
This range may be inclusive of several career levels at Hebbia and will be narrowed during the interview process based on the candidate’s experience and qualifications.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_8a8c0eb9-6e6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Hebbia","sameAs":"https://hebbia.com/","logo":"https://logos.yubhub.co/hebbia.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/hebbia/jobs/4670090005","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$180,000 - $260,000","x-skills-required":["SQL","Python","dbt","Airflow","Snowflake","BigQuery","data modeling","warehouse architecture","product analytics","analytics engineering","data science"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:37:39.339Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City; San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, Python, dbt, Airflow, Snowflake, BigQuery, data modeling, warehouse architecture, product analytics, analytics engineering, data science","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":260000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4475ebe1-e8a"},"title":"Data Engineering Intern","description":"<p>We&#39;re seeking a motivated and curious Data Engineering Intern to join our Data Platform team. 
This internship offers a unique opportunity to gain hands-on experience building and maintaining real data infrastructure within a fast-growing fintech environment.</p>\n<p>As a Data Engineering Intern, you&#39;ll collaborate on thoughtful projects and bring your fresh perspectives to impact our product and families. You&#39;ll assist in building and maintaining data pipelines using Airflow to orchestrate workflows that ingest, transform, and deliver data into Snowflake and Databricks.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Assist in building and maintaining data pipelines using Airflow to orchestrate workflows that ingest, transform, and deliver data into Snowflake and Databricks</li>\n<li>Support the design and implementation of data models in Snowflake that serve analytics, reporting, and ML use cases</li>\n<li>Help develop and maintain transformation logic using dbt, including writing models, tests, and documentation</li>\n<li>Contribute to data quality checks and validation processes to ensure accuracy, completeness, and timeliness of data</li>\n<li>Assist with infrastructure automation using Terraform to manage cloud resources in AWS</li>\n<li>Participate in troubleshooting data pipeline issues and investigating root causes alongside senior engineers</li>\n<li>Collaborate with data analysts, analytics engineers, and business stakeholders to understand requirements and contribute to technical solutions</li>\n<li>Help create and maintain documentation for data pipelines, data models, and infrastructure processes</li>\n<li>Participate in code reviews to develop best practices and learn from the team</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>A 3.0 GPA or higher</li>\n<li>Currently pursuing a Bachelor&#39;s or Master&#39;s degree in Computer Science, Data Science, Information Technology, or a related field</li>\n<li>Basic understanding of SQL and comfort manipulating data</li>\n<li>Interest in data engineering, data infrastructure, and/or analytics 
engineering</li>\n<li>Familiarity with Python for scripting, data processing, or automation (preferred)</li>\n<li>Basic understanding of cloud platforms, particularly AWS, is a plus</li>\n<li>Strong analytical thinking and problem-solving skills, especially comfort working through ambiguity</li>\n<li>Good communication and collaboration skills; able to work cross-functionally with technical and non-technical teammates</li>\n<li>Eagerness to take ownership of your work and ask thoughtful questions</li>\n</ul>\n<p>Learning Opportunities:</p>\n<ul>\n<li><p>You&#39;ll have the opportunity to gain experience with technologies including:</p>\n<ul>\n<li>Snowflake</li>\n<li>dbt (data build tool)</li>\n<li>Apache Airflow</li>\n<li>AWS (S3, Lambda, EC2, IAM)</li>\n<li>Databricks</li>\n<li>Terraform</li>\n<li>Fivetran</li>\n<li>Segment</li>\n</ul>\n</li>\n</ul>\n<p>This internship provides an excellent foundation for a career in data engineering, analytics engineering, or data architecture within the fintech industry.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_4475ebe1-e8a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Greenlight","sameAs":"https://www.greenlight.com/","logo":"https://logos.yubhub.co/greenlight.com.png"},"x-apply-url":"https://jobs.lever.co/greenlight/b5d9d9b2-9d06-4db7-932c-30fd4a43825d","x-work-arrangement":"hybrid","x-experience-level":"intern","x-job-type":"internship","x-salary-range":null,"x-skills-required":["SQL","Python","Airflow","Snowflake","dbt","AWS","Databricks","Terraform","Fivetran","Segment"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:36:54.798Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Atlanta"}},"employmentType":"INTERN","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, Python, Airflow, Snowflake, dbt, AWS, 
Databricks, Terraform, Fivetran, Segment"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_58df2f04-af4"},"title":"Data Engineer","description":"<p>We are looking for a Data Engineer to join our Data Platform team to partner with our product and business stakeholders across risk, operations, and other domains. As a Data Engineer, you will be responsible for building robust data pipelines and engineering foundations by ingesting data from disparate sources, ensuring data quality and consistency, and enabling better business decisions through reliable data infrastructure across core product areas.</p>\n<p>Your primary focus will be on building scalable data pipelines using Airflow to orchestrate data workflows that ingest, transform, and deliver data from various sources into Snowflake and Databricks. You will also design and implement data models in Snowflake that support analytics, reporting, and ML use cases with a focus on performance, reliability, and scalability.</p>\n<p>In addition, you will develop infrastructure as code using Terraform to automate and manage cloud resources in AWS, ensuring consistent and reproducible deployments. You will monitor data pipeline health and implement data quality checks to ensure accuracy, completeness, and timeliness of data as business needs evolve.</p>\n<p>You will also optimize data processing workflows to improve performance, reduce costs, and handle growing data volumes efficiently. Troubleshooting and resolving data pipeline issues, working through ambiguity to get to the root cause and implementing long-term fixes will be a key part of your role.</p>\n<p>As a Data Engineer, you will bridge gaps between data and the business by working with cross-functional teams across the US and India office to understand requirements and translate them into robust technical solutions. 
You will create comprehensive documentation on data pipelines, data models, and infrastructure, keeping documentation up to date and facilitating knowledge transfer across the team.</p>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>2+ years of data engineering experience with strong technical skills and the ability to architect scalable data solutions.</li>\n</ul>\n<ul>\n<li>Hands-on experience with Python for data processing, automation, and building data pipelines.</li>\n</ul>\n<ul>\n<li>Proficiency with workflow orchestration tools, preferably Airflow, including DAG development, task dependencies, and monitoring.</li>\n</ul>\n<ul>\n<li>Strong SQL skills and experience with cloud data warehouses like Snowflake, including performance optimization and data modeling.</li>\n</ul>\n<ul>\n<li>Experience with cloud platforms, preferably AWS (S3, Lambda, EC2, IAM, etc.), and understanding of cloud-based data architectures.</li>\n</ul>\n<ul>\n<li>Experience working cross-functionally with data analysts, analytics engineers, data scientists, and business stakeholders to understand requirements and deliver solutions.</li>\n</ul>\n<ul>\n<li>An ownership mentality – this engineer will be responsible for the reliability and performance of their data pipelines and expected to fully understand data flows, dependencies, and their implications on downstream users.</li>\n</ul>\n<p><strong>Nice to have:</strong></p>\n<ul>\n<li>Experience with dbt for transformation logic and analytics engineering workflows integrated with data pipelines.</li>\n</ul>\n<ul>\n<li>Familiarity with Databricks for large-scale data processing, including Spark optimization and Delta Lake.</li>\n</ul>\n<ul>\n<li>Experience with Infrastructure as Code (IaC) tools like Terraform for managing cloud resources and data infrastructure.</li>\n</ul>\n<ul>\n<li>Knowledge of data modeling concepts (e.g., dimensional modeling, star/snowflake schemas, slowly changing dimensions).</li>\n</ul>\n<ul>\n<li>Experience 
with CI/CD practices for data pipelines and automated testing frameworks.</li>\n</ul>\n<ul>\n<li>Experience with streaming data and real-time processing frameworks</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_58df2f04-af4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Greenlight","sameAs":"https://www.greenlight.com/","logo":"https://logos.yubhub.co/greenlight.com.png"},"x-apply-url":"https://jobs.lever.co/greenlight/e98d9733-8b8c-4ce4-997d-6cf14e35b2f3","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Airflow","Python","SQL","Snowflake","Databricks","AWS","Terraform","data engineering","data pipelines","data modeling"],"x-skills-preferred":["dbt","Infrastructure as Code","CI/CD","streaming data","real-time processing"],"datePosted":"2026-04-17T12:36:30.660Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Airflow, Python, SQL, Snowflake, Databricks, AWS, Terraform, data engineering, data pipelines, data modeling, dbt, Infrastructure as Code, CI/CD, streaming data, real-time processing"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_87c01508-dd0"},"title":"Data Analyst","description":"<p>We are looking for a Data Analyst to join our Marketing Science team. 
As a Data Analyst, you will be the analytical lead across growth marketing, registration, and website analytics, setting the measurement framework, driving analysis independently, and translating results into decisions stakeholders can act on.</p>\n<p>Your day-to-day will involve owning analytical coverage across growth marketing, registration, and website, setting measurement frameworks, building scalable reporting, and driving the analytical roadmap for your domain. You will lead analysis on complex, ambiguous questions: incrementality testing, LTV and payback modeling, attribution deep dives, funnel decomposition, and cohort analysis.</p>\n<p>You will build and own dbt models that encode business logic for your domain in a scalable, reusable way to power dashboards, reporting, and stakeholder self-serve. You will build and maintain Tableau dashboards that give marketing and leadership a clear, reliable view of performance across growth, registration, and web.</p>\n<p>You will partner directly with marketing stakeholders to scope analytical problems, pressure-test interpretations, and make sure findings land as decisions, not just slide decks. You will use Claude fluently as a daily productivity layer, accelerating SQL, generating documentation, QA-ing work, and building faster analytical workflows.</p>\n<p>You will mentor and raise the floor for junior analysts on the team. You will review their work, help them think through ambiguous problems, and set the standard for analytical quality. You will identify gaps before they become problems, from data quality and metric definitions to reporting coverage and stakeholder understanding. You will own the fix, not just the flag.</p>\n<p>We are looking for someone with 4–8 years of experience in a data or analytics role, with hands-on experience in at least one marketing channel: paid search (preferred), paid social, TV/CTV, affiliates, direct mail, or similar. 
You should understand how that channel&#39;s data works, how it&#39;s structured, what the key metrics are, and how performance is measured and reported. Multi-channel experience is a strong plus.</p>\n<p>You should have strong, proven SQL ability. You should write clean, efficient queries across complex schemas and build analytics logic others can maintain. Dbt experience, including owning models in production, is a strong plus. You should be proficient with Claude or similar AI tools for analytical work. You should use them to move faster, not as a crutch. You should know how to validate outputs rather than ship the first answer.</p>\n<p>You should write readable, well-structured Python for analytical work: data manipulation, automation, or scripting. Your code should be something a teammate can pick up and maintain. You should understand how attribution models work, what makes a good experiment, and how to think about LTV, CAC, and payback period without needing a primer.</p>\n<p>You should be proactive and not wait to be asked. You should spot the gap, scope the work, and drive it to a conclusion. Your stakeholders shouldn&#39;t have to manage your backlog. You should be able to take a fuzzy business question and turn it into a clear analytical plan. You should ask the right clarifying questions and bring structure without needing the problem handed to you pre-scoped.</p>\n<p>You should communicate findings in a way that drives decisions. You should know when a chart is enough and when you need a narrative. You&#39;ve presented to senior stakeholders and know how to calibrate.</p>\n<p>Preferred experience includes breadth across multiple marketing channels: paid search, paid social, TV/CTV, affiliates, or others. Hands-on experience with experiment design and incrementality testing is also preferred. Experience with Snowflake or a similar cloud data warehouse is a plus. 
Experience mentoring or reviewing the work of junior analysts is also a plus.</p>\n<p>Work perks at Greenlight include medical, dental, vision, and HSA match, paid life insurance, AD&amp;D, and disability benefits, traditional 401k with company match, unlimited PTO, paid company holidays and pop-up bonus holidays, professional development stipends, mental health resources, 1:1 financial planners, fertility healthcare, 100% paid parental and caregiving leave, plus cleaning service and meals during your leave, flexible WFH, both remote and in-office opportunities, fully stocked kitchen, catered lunches, and occasional in-office happy hours, employee resource groups.</p>\n<p>Our stance on salaries is that Greenlight provides a competitive compensation package with a market-based approach to pay and will vary depending on your location, experience, and skill set. The total compensation package for this position will also include a discretionary performance bonus, equity rewards, medical benefits, 401K match, and more. Greenlight conducts continuous compensation evaluations across departments and geographies to ensure we are keeping our pay current and competitive.</p>\n<p>The estimated base pay range for this position in NY, CA, WA is $125,000-145,000. 
The estimated base pay range for this position in CO is $125,000-135,000.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_87c01508-dd0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Greenlight","sameAs":"https://www.greenlight.com/","logo":"https://logos.yubhub.co/greenlight.com.png"},"x-apply-url":"https://jobs.lever.co/greenlight/084768a3-1c27-4390-b94c-d987f636811a","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$125,000-145,000 (NY, CA, WA); $125,000-135,000 (CO)","x-skills-required":["SQL","dbt","Tableau","Claude","Python","data manipulation","automation","scripting","attribution models","experiment design","incrementality testing"],"x-skills-preferred":["paid search","paid social","TV/CTV","affiliates","direct mail","Snowflake","cloud data warehouse","mentoring","reviewing junior analyst work"],"datePosted":"2026-04-17T12:35:56.953Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Atlanta"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, dbt, Tableau, Claude, Python, data manipulation, automation, scripting, attribution models, experiment design, incrementality testing, paid search, paid social, TV/CTV, affiliates, direct mail, Snowflake, cloud data warehouse, mentoring, reviewing junior analyst work","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":125000,"maxValue":145000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5242ca9a-088"},"title":"Staff Automation Engineer","description":"<p>We are looking for a Staff Automation Engineer to have a huge impact on the Business Systems, Security, 
Production Engineering and IT functions. This role is for a seasoned engineer who thrives on solving complex operational challenges, enhancing system security and stability, and improving efficiency through automation and best practices using AI technologies.</p>\n<p>Your day-to-day will involve implementing Agentic AI and LLM-powered workflows using tools like Tines, AWS Agentcore, AWS Bedrock, Claude Code, etc. You will deploy systems with Infrastructure as Code (IaC) (e.g., Terraform) and build and maintain automation workflows across key enterprise platforms (e.g., Atlassian, Okta, Google Workspace, Slack, Zoom, knowledge management systems), cybersecurity systems (e.g., SIEM, GRC platforms, Data Security Platforms), and cloud environments (AWS, GCP).</p>\n<p>You will build AI-driven chatbots or intelligent agents that automate tasks, support conversational workflows, and integrate with enterprise applications. You will partner with IT, Security, GRC, Procurement, and business teams to automate operational tasks and processes to reduce toil, improve efficiency, and enable the business.</p>\n<p>You will develop integrations using REST APIs, JSON, webhooks, and scripting languages (JavaScript, Python). You will follow established automation and AI standards for quality, security, and governance, and propose improvements where appropriate.</p>\n<p>You will troubleshoot, maintain, and optimize existing workflows to improve stability and performance. You will document designs, workflows, configurations, and operational procedures.</p>\n<p>You will participate in code reviews, technical discussions, and team-based learning to uplift engineering quality and consistency.</p>\n<p>You will work with various tooling in Security, IT, and Production Engineering.</p>\n<p>This role requires 10+ years of experience in automation engineering, systems integration, or workflow development. 
You should have experience with automation platforms such as Tines, Retool, Superblocks, n8n, etc. You should also have hands-on experience with Terraform and containerization technologies.</p>\n<p>You should have experience developing LLM-powered automations, conversational interfaces, or Agentic AI assistants. You should have knowledge of Git and modern version control practices.</p>\n<p>You should have strong skills in REST APIs, JSON, webhooks, JavaScript, and Python. You should also have familiarity with identity systems (Okta, SCIM) and RBAC concepts.</p>\n<p>You should have familiarity with cloud environments such as Google Cloud Platform (GCP) and Amazon Web Services (AWS).</p>\n<p>You should be able to break down problems, collaborate cross-functionally, and deliver solutions with moderate guidance.</p>\n<p>You should have strong communication skills and the ability to translate functional requirements into technical outputs.</p>\n<p>Preferred experience includes familiarity with data platform and database technologies (e.g., Snowflake, PostgreSQL, Cassandra, DynamoDB).</p>\n<p>Work perks at Greenlight include medical, dental, vision, and HSA match, paid life insurance, AD&amp;D, and disability benefits, traditional 401k with company match, unlimited PTO, paid company holidays and pop-up bonus holidays, professional development stipends, mental health resources, 1:1 financial planners, fertility healthcare, 100% paid parental and caregiving leave, plus cleaning service and meals during your leave, flexible WFH, both remote and in-office opportunities, fully stocked kitchen, catered lunches, and occasional in-office happy hours, employee resource groups.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5242ca9a-088","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Greenlight","sameAs":"https://www.greenlight.com/","logo":"https://logos.yubhub.co/greenlight.com.png"},"x-apply-url":"https://jobs.lever.co/greenlight/d85a9c34-4434-4f6d-8f01-bccb9521c036","x-work-arrangement":"hybrid","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$180,000-$225,000","x-skills-required":["Agentic AI","LLM-powered workflows","Tines","AWS Agentcore","AWS Bedrock","Claude Code","Infrastructure as Code (IaC)","Terraform","REST APIs","JSON","webhooks","JavaScript","Python","Git","modern version control practices","identity systems","RBAC concepts","cloud environments","Google Cloud Platform (GCP)","Amazon Web Services (AWS)"],"x-skills-preferred":["data platform and database technologies","Snowflake","PostgreSQL","Cassandra","DynamoDB"],"datePosted":"2026-04-17T12:35:33.366Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Agentic AI, LLM-powered workflows, Tines, AWS Agentcore, AWS Bedrock, Claude Code, Infrastructure as Code (IaC), Terraform, REST APIs, JSON, webhooks, JavaScript, Python, Git, modern version control practices, identity systems, RBAC concepts, cloud environments, Google Cloud Platform (GCP), Amazon Web Services (AWS), data platform and database technologies, Snowflake, PostgreSQL, Cassandra, DynamoDB","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":225000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c330fc0f-900"},"title":"Senior Manager, Customer Success Operations","description":"<p>At Eve, we&#39;re redefining what&#39;s possible in 
legal technology. Our mission is to empower plaintiff law firms with AI-driven solutions that elevate how they operate, serve clients, and grow.</p>\n<p>We believe the future of law will be built by &#39;AI-Native Law Firms&#39;: firms that are managed, scaled, and optimized by intelligent systems rather than manual processes and endless administrative work.</p>\n<p>Our technology augments the capabilities of attorneys across every stage of a case, from intake and document review to strategy and settlement, so they can focus on what truly matters: achieving the best outcomes for their clients.</p>\n<p>Why Join Eve:</p>\n<ul>\n<li>Product-market fit: Eve is used by more than 550 law firms, and we&#39;re growing fast.</li>\n<li>Backed by top investors: We&#39;ve raised over $160M from world-class partners including Spark Capital, Andreessen Horowitz (a16z), Menlo Ventures, and Lightspeed.</li>\n<li>Built by a world-class team: Engineers, designers, and operators from places like Scale, Meta, Airbnb, Cruise, Square, Rubrik, and Lyft are building Eve from the ground up.</li>\n<li>AI-Native from day one: We&#39;re on the bleeding edge of AI, collaborating directly with teams at OpenAI and Anthropic to build best-in-class AI workflows tailored for legal work.</li>\n<li>Explosive growth: We are growing revenue 2x quarter over quarter.</li>\n</ul>\n<p><strong>Why This Role:</strong></p>\n<p>You&#39;ll build the data, process, and tooling layer for Eve&#39;s Customer Success organization. We have 900+ accounts, a growing renewal book, and a CS team that&#39;s scaling fast. Health scores need to be rebuilt. Renewal forecasting needs to be trusted by Finance. Playbooks exist in theory but aren&#39;t operationalized. The data is there (Snowflake, product telemetry, CRM), but it&#39;s not connected to the workflows that CSMs run every day. This is not a reporting role. 
You&#39;ll spend most of your time designing systems: building the health score methodology that predicts churn, operationalizing playbooks, and standing up renewal forecasting that leadership and Finance trust. The rest of your time goes to capacity planning, territory design, and partnering with RevOps and Finance to make sure CS metrics are calculated consistently across the business.</p>\n<p><strong>What You&#39;ll Accomplish:</strong></p>\n<ul>\n<li>Own CS analytics end-to-end: health scoring methodology, renewal forecasting, portfolio segmentation, and the dashboards that leadership and Finance rely on for decision-making</li>\n<li>Design and operationalize playbooks that drive customer outcomes: onboarding workflows, risk intervention sequences, expansion motions, renewal execution</li>\n<li>Use AI tools as part of your daily workflow: automating data pulls, building smarter alerting, finding patterns that manual analysis would miss</li>\n<li>Build and maintain the CS tech stack. Clean data, working automations, tooling the team relies on.</li>\n</ul>\n<ul>\n<li>Partner with GTM Systems on shared infrastructure across the revenue stack</li>\n</ul>\n<ul>\n<li>Deliver capacity planning, territory design, and comp modeling that keeps pace with a growing team and portfolio</li>\n<li>Identify patterns across the portfolio: what&#39;s driving churn and where onboarding stalls. 
Turn those into specific actions for CSMs and leadership</li>\n<li>Partner with RevOps and Finance to make sure CS metrics (GRR, NRR, renewal forecasts) are calculated consistently, trusted across the business, and ready for executive reporting</li>\n</ul>\n<p><strong>What We Are Looking For:</strong></p>\n<ul>\n<li>5-8+ years in CS Operations, Revenue Operations, or Business Operations at a B2B SaaS company, with at least 2 years focused on Customer Success</li>\n<li>Hands-on experience building health scores, renewal forecasting models, and portfolio segmentation frameworks</li>\n<li>Strong SQL skills and comfort working directly with data warehouses</li>\n<li>Experience implementing and administering CS platforms, including configuration, workflow design, and driving adoption across a team</li>\n<li>Track record of designing processes that CSMs actually follow</li>\n<li>Ability to translate data into narrative for leadership: what&#39;s happening, why it matters, what to do about it</li>\n<li>Comfortable using AI tools (Claude, ChatGPT, Copilot) for analysis, automation, and workflow design</li>\n</ul>\n<p><strong>You&#39;ll Thrive in This Role If You Have:</strong></p>\n<ul>\n<li>Experience with consumption-based or usage-based pricing models</li>\n<li>Background in legal technology or professional services SaaS</li>\n<li>Familiarity with AI/ML products and the adoption challenges they create</li>\n<li>Experience with CS platforms like Gainsight, Vitally, ChurnZero, or Totango</li>\n<li>Experience with Salesforce or HubSpot CRM administration and reporting.</li>\n<li>Familiarity with Snowflake, Looker, or similar BI/warehouse tools</li>\n<li>Background scaling CS operations at a company growing to $100M+ in ARR</li>\n</ul>\n<p><strong>Additional Information</strong></p>\n<p>Benefits:</p>\n<p>Competitive Salary &amp; Equity 401(k) Program with Employer Matching Health, Dental, Vision and Life Insurance Short Term and Long Term Disability Commuter Benefits 
Autonomous Work Environment Office Setup Reimbursement Flexible Time Off (FTO) + Holidays Quarterly Team Gatherings In office Perks</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_c330fc0f-900","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Eve","sameAs":"https://www.eve.com/","logo":"https://logos.yubhub.co/eve.com.png"},"x-apply-url":"https://jobs.lever.co/Eve/acb41a18-4b86-44e7-8d29-eaf00b6b9091","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Customer Success Operations","Revenue Operations","Business Operations","SQL","Data Warehouses","CS Platforms","Gainsight","Vitally","ChurnZero","Totango","Salesforce","HubSpot","CRM Administration","Reporting","AI Tools","Claude","ChatGPT","Copilot"],"x-skills-preferred":["Consumption-Based Pricing Models","Usage-Based Pricing Models","Legal Technology","Professional Services SaaS","AI/ML Products","Adoption Challenges","Snowflake","Looker","BI/Warehouse Tools"],"datePosted":"2026-04-17T12:30:24.125Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Operations","industry":"Technology","skills":"Customer Success Operations, Revenue Operations, Business Operations, SQL, Data Warehouses, CS Platforms, Gainsight, Vitally, ChurnZero, Totango, Salesforce, HubSpot, CRM Administration, Reporting, AI Tools, Claude, ChatGPT, Copilot, Consumption-Based Pricing Models, Usage-Based Pricing Models, Legal Technology, Professional Services SaaS, AI/ML Products, Adoption Challenges, Snowflake, Looker, BI/Warehouse Tools"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_f0f321c2-15d"},"title":"Data Platform 
Engineer","description":"<p>At Anchorage Digital, we are building the world&#39;s most advanced digital asset platform for institutions to participate in crypto. Join the Data Platform team and build the Trusted Data Platform that powers Anchorage&#39;s transition to Data 3.0.</p>\n<p>You&#39;ll help shape the unified orchestration foundation, collaborate on governance-as-code patterns, and contribute to self-service frameworks that make quality and compliance automatic. We&#39;re moving from manual spreadsheets and theoretical architectures to automated control planes where every dataset is trusted, monitored, and traceable by default.</p>\n<p><strong>Technical Skills:</strong></p>\n<ul>\n<li>Collaborate on designing and implementing unified orchestration patterns (Dagster/Airflow) to replace legacy and fragmented scheduling</li>\n<li>Develop governance-as-code systems in partnership with the team that automatically apply policy tags, RLS, and access controls through an active control plane</li>\n</ul>\n<p><strong>Complexity and Impact of Work:</strong></p>\n<ul>\n<li>Help guide the technical design for platform capabilities like data contracts, automated quality gating, observability, and cost visibility</li>\n<li>Support the migration of workloads from legacy patterns to the modern platform, ensuring domain teams have clear paths and golden templates</li>\n</ul>\n<p><strong>Organizational Knowledge:</strong></p>\n<ul>\n<li>Partner with domain teams (Asset Data, Reporting &amp; Statements, Product teams) to understand their needs and design platform capabilities that enable their success</li>\n<li>Promote and support data mesh principles and dbt best practices, helping domain owners build and own their data products while platform ensures quality</li>\n</ul>\n<p><strong>Communication and Influence:</strong></p>\n<ul>\n<li>Promote data platform engineering best practices, developer experience, and &#39;Data as a Product&#39; principles across the engineering 
organization</li>\n<li>Contribute to architectural decisions and help establish engineering culture around reliability, cost efficiency, and operational excellence</li>\n</ul>\n<p><strong>You may be a fit for this role if you:</strong></p>\n<ul>\n<li>5-7+ years building data platforms or infrastructure: You bring experience helping design and operate modern data platforms that handle enterprise-scale workloads with quality, governance, and cost controls</li>\n<li>Strong dbt and SQL expertise: You&#39;re proficient with dbt and SQL, understand dbt Mesh, and have strong opinions on data modeling, testing, and documentation best practices</li>\n<li>Orchestration experience: You&#39;ve implemented production data orchestration with Airflow, Dagster, Prefect, or similar tools, and understand the trade-offs between different orchestration patterns</li>\n<li>Cloud data warehouse proficiency: You have strong experience with BigQuery, Snowflake, or Redshift, including query optimization, cost management, and security configurations</li>\n<li>Platform mindset: You think in terms of golden paths, reusable abstractions, and developer experience - you build systems that let others move fast safely</li>\n</ul>\n<p><strong>Although not a requirement, bonus points if:</strong></p>\n<ul>\n<li>Metadata and catalog experience: You&#39;ve worked with Atlan, Collibra, DataHub, or similar metadata platforms and understand active governance patterns</li>\n<li>Data observability tools: You&#39;ve implemented data quality monitoring with Great Expectations, Monte Carlo, Soda, or similar tools</li>\n<li>Infrastructure as code: You have experience with Terraform, Kubernetes, and modern DevOps practices for data infrastructure</li>\n<li>You&#39;re the kind of person who gets excited about declarative config, immutable infrastructure, and metrics dashboards showing cost-per-query trending down</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by 
<a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_f0f321c2-15d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anchorage Digital","sameAs":"https://www.anchorage.co/","logo":"https://logos.yubhub.co/anchorage.co.png"},"x-apply-url":"https://jobs.lever.co/anchorage/8a325cd5-ef99-4f1e-bba8-7bb1fca64f12","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["dbt","SQL","Airflow","Dagster","Prefect","BigQuery","Snowflake","Redshift"],"x-skills-preferred":["Metadata and catalog experience","Data observability tools","Infrastructure as code"],"datePosted":"2026-04-17T12:24:40.602Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"dbt, SQL, Airflow, Dagster, Prefect, BigQuery, Snowflake, Redshift, Metadata and catalog experience, Data observability tools, Infrastructure as code"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3d849fbc-058"},"title":"Member of Product, Data Platform","description":"<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto.</p>\n<p>The Data Platform team is the backbone of Anchorage Digital&#39;s information infrastructure. 
As data becomes the lifeblood of every product, compliance workflow, and client-facing report we produce, this team is responsible for building and operating a unified, scalable, and reliable data platform that serves the entire organization.</p>\n<p>As a Data Platform Product Manager, you will own the strategy and execution for centralizing and formalizing the company&#39;s data infrastructure, spanning internal operational data, transaction and blockchain data, customer data, and external data sources.</p>\n<p>Your mission is to transform a fragmented data landscape into a single source of truth that powers mission-critical reporting, business insights, and downstream product experiences across every team at Anchorage.</p>\n<p>This is a force-multiplier role. Your work will elevate the quality, speed, and reliability of every product and team at the company.</p>\n<p>You will define the standards, build the platform, and create the foundation that enables Anchorage to scale with confidence.</p>\n<p>If you thrive at the intersection of complex data systems, cross-functional influence, and platform thinking, this is your opportunity to have outsized impact at a category-defining company in digital assets.</p>\n<p>Below, we define our Factors of Growth &amp; Impact to help Anchorage Villagers measure their impact and articulate feedback, coaching, and the rich learning that happens while exploring, developing, and mastering capabilities within and beyond the Member of Product, Data Platform role:</p>\n<p><strong>Technical Skills:</strong></p>\n<ul>\n<li>Own the detailed prioritization of the data platform roadmap, balancing foundational infrastructure work, new capabilities, and technical debt.</li>\n<li>Demonstrate deep strategic thinking in shaping the platform roadmap, considering the unique data challenges of digital assets, blockchain protocols, and regulated financial services.</li>\n<li>Deliver complex, cross-functional projects with multiple dependencies 
across engineering, analytics, compliance, and operations teams.</li>\n<li>Work closely with engineering and data science counterparts to drive product development processes, sprint planning, and architectural decisions.</li>\n<li>Ability to understand and reason about system architecture, including data warehousing, ETL/ELT pipelines, streaming vs. batch processing, and modern data stack components, and communicate clear requirements to engineering.</li>\n<li>Drive comprehensive go-to-market strategy for internal platform adoption, including defining success metrics, tracking KPIs around data quality and platform usage, and iterating based on data-driven insights.</li>\n</ul>\n<p><strong>Complexity and Impact of Work:</strong></p>\n<ul>\n<li>Lead and influence cross-functional teams while maintaining strong stakeholder relationships across the entire organization, from engineering to finance to compliance.</li>\n<li>Exercise independent decision-making and take full ownership of data platform strategy and execution.</li>\n<li>Contribute strategic insights that significantly impact company direction, operational efficiency, and product quality.</li>\n<li>Demonstrate platform leadership that elevates the performance and effectiveness of every team that depends on data.</li>\n</ul>\n<p><strong>Organizational Knowledge:</strong></p>\n<ul>\n<li>Develop deep understanding of Anchorage&#39;s business model, product suite, regulatory environment, and organizational structure.</li>\n<li>Build and maintain strong relationships with stakeholders across all departments to ensure the data platform serves the company&#39;s most critical needs.</li>\n<li>Navigate and improve organizational data practices to enhance efficiency, compliance, and decision-making.</li>\n<li>Drive company objectives through strategic data platform decisions and initiatives.</li>\n</ul>\n<p><strong>Communication and Influence:</strong></p>\n<ul>\n<li>Effectively influence and motivate teams across 
the organization to adopt platform standards and invest in data quality, even when those teams do not report to you.</li>\n<li>Enable cross-functional collaboration through clear, consistent communication about platform capabilities, timelines, and data governance expectations.</li>\n<li>Act as a thoughtful knowledge partner to senior leadership, translating complex data infrastructure topics into clear business impact.</li>\n<li>Proactively communicate platform goals, status updates, and data health metrics throughout the organization.</li>\n</ul>\n<p><strong>You may be a fit for this role if you:</strong></p>\n<ul>\n<li>5+ years of product management experience, with significant time spent on data platforms, data infrastructure, or data-intensive enterprise products.</li>\n<li>Proven experience building or scaling enterprise data platforms, including data warehousing, data lakes, ETL/ELT pipelines, or modern data stack tooling (e.g., Snowflake, Databricks, dbt, Airflow, Spark).</li>\n<li>Strong understanding of data modeling, data governance, and data quality frameworks.</li>\n<li>Experience working with diverse data types, including transactional data, customer data, financial data, and ideally blockchain or on-chain data.</li>\n<li>Track record of driving cross-functional alignment and adoption for internal platform products where you must influence without direct authority.</li>\n<li>Exceptional written and verbal communication skills, with the ability to convey complex data architecture concepts to both technical and non-technical audiences.</li>\n<li>Your empathy and adaptability not only complement others&#39; working styles but also embody our culture of curiosity, creativity, and shared understanding.</li>\n<li>You self-describe as some combination of the following: creative, humble, ambitious, detail-oriented, hard-working, trustworthy, eager to learn, methodical, action-oriented, and tenacious.</li>\n</ul>\n<p><strong>Although not a requirement, bonus 
points if:</strong></p>\n<ul>\n<li>You have hands-on experience with blockchain data indexing, onchain analytics, or crypto-native data infrastructure.</li>\n<li>You have built data platforms that serve both internal analytics consumers and external client-facing products (reports, statements, dashboards).</li>\n<li>You have experience supporting clients with data-related issues or concerns.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3d849fbc-058","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anchorage Digital","sameAs":"https://anchorage.com","logo":"https://logos.yubhub.co/anchorage.com.png"},"x-apply-url":"https://jobs.lever.co/anchorage/0e730f61-a2e4-4152-8277-3f6383cc69a6","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data platforms","data infrastructure","data-intensive enterprise products","data warehousing","data lakes","ETL/ELT pipelines","modern data stack tooling","Snowflake","Databricks","dbt","Airflow","Spark","data modeling","data governance","data quality frameworks","blockchain or on-chain data"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:18:21.529Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data platforms, data infrastructure, data-intensive enterprise products, data warehousing, data lakes, ETL/ELT pipelines, modern data stack tooling, Snowflake, Databricks, dbt, Airflow, Spark, data modeling, data governance, data quality frameworks, blockchain or on-chain 
data"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1378ad18-3c9"},"title":"Data Team Leader","description":"<p>We are looking for an outstanding Data Team Leader to join our motivated engineering team. As a Data Team Leader, you will lead a dedicated group of data engineers, ensuring the successful implementation of innovative data pipelines and architectures.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Lead and coordinate a team of data engineers, ensuring delivery across multiple projects.</li>\n<li>Architect and implement scalable, high-performance data pipelines using Snowflake, dbt, and Airflow.</li>\n<li>Apply and guide others in using distributed systems and queueing technologies such as Celery, Redis, or equivalents.</li>\n<li>Own the end-to-end data lifecycle: ingestion, modeling, transformation, and delivery.</li>\n<li>Partner with cross-functional teams (product, analytics, DevOps) to meet business data needs.</li>\n<li>Enforce engineering guidelines, code quality, and performance standards.</li>\n<li>Conduct regular 1:1s, technical reviews, and provide mentorship to team members.</li>\n<li>Take initiative in capacity planning, hiring, and team scaling decisions.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li><p>5+ years of hands-on experience in data engineering.</p>\n</li>\n<li><p>2+ years of formal team leadership experience, including people management and project ownership.</p>\n</li>\n<li><p>Advanced knowledge of:</p>\n<ul>\n<li>Snowflake for warehousing and performance tuning.</li>\n<li>dbt for modular data modeling and testing.</li>\n<li>Apache Airflow (or similar workflow orchestrators).</li>\n<li>Distributed task and caching systems such as Celery, Redis, or similar technologies.</li>\n<li>Python, SQL, and shell scripting.</li>\n</ul>\n</li>\n<li><p>Experience with cloud platforms such as AWS, Azure, or GCP.</p>\n</li>\n<li><p>Strong grasp of software development 
guidelines, CI/CD, and data observability.</p>\n</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Experience with real-time data streaming (e.g., Kafka).</li>\n<li>Familiarity with Terraform or other infrastructure-as-code tools.</li>\n<li>Prior experience in startup or high-growth environments.</li>\n<li>Exposure to BI platforms (e.g., Power BI, Looker, Tableau).</li>\n</ul>\n<p>Why Aristocrat?</p>\n<p>Aristocrat is a world leader in gaming content and technology, and a top-tier publisher of free-to-play mobile games. We deliver great performance for our B2B customers and bring joy to the lives of the millions of people who love to play our casino and mobile games.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_1378ad18-3c9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Aristocrat","sameAs":"https://www.aristocrat.com/","logo":"https://logos.yubhub.co/aristocrat.com.png"},"x-apply-url":"https://aristocrat.wd3.myworkdayjobs.com/en-US/AristocratExternalCareersSite/job/Noida-UP-IN/Data-Team-Leader_R0020618","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Snowflake","dbt","Apache Airflow","Celery","Redis","Python","SQL","shell scripting","AWS","Azure","GCP","software development guidelines","CI/CD","data observability"],"x-skills-preferred":["real-time data streaming","Terraform","infrastructure-as-code tools","BI platforms"],"datePosted":"2026-03-10T12:13:54.011Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Noida, UP, IN"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Snowflake, dbt, Apache Airflow, Celery, Redis, Python, SQL, shell scripting, AWS, Azure, GCP, software development guidelines, CI/CD, data observability, real-time data 
streaming, Terraform, infrastructure-as-code tools, BI platforms"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b23d070f-1b3"},"title":"FBS Sr Data Engineer (Insurance Experience)","description":"<p>FBS – Farmer Business Services is part of Farmers operations. We&#39;re building a global approach to identifying, recruiting, hiring, and retaining top talent. Our goal is to create diverse and high-performing teams that thrive in today&#39;s competitive marketplace.</p>\n<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That&#39;s where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>As a Data Engineer with a strong P&amp;C insurance background, you&#39;ll analyse business data stories and translate them into technical requirements. You&#39;ll design, build, test, and implement data products of various complexity with minimal guidance.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Acquire, curate, and publish data for analytical or operational uses</li>\n<li>Ensure data is in a ready-to-use form that creates a single version of the truth across all data consumers</li>\n<li>Translate business analytic requests/requirements into design, development, testing, deployment, and production maintenance tasks</li>\n<li>Work independently with various technologies from big data, relational and non-relational databases, cloud environments, different programming languages, and various reporting tools</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>4-6 years of experience as a Data Engineer with ETL using SQL</li>\n<li>At least 2 years of insurance background – ideally P&amp;C, but flexible on other areas (Health, Life) if upskilling is possible</li>\n<li>Advanced SQL skills</li>\n<li>Advanced DBT/Informatica skills</li>\n<li>Intermediate Snowflake 
skills</li>\n<li>Intermediate Python/ PySpark skills</li>\n<li>Intermediate Shell Scripting skills</li>\n<li>Intermediate Power BI skills</li>\n<li>Strong communication and problem-solving skills</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<ul>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Flexible work arrangements (remote and/or office-based)</li>\n<li>Private Health Insurance</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b23d070f-1b3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/5qzBuoS2KyVBegQHDpzsPN/remote-fbs-sr-data-engineer-(insurance-experience)-in-mexico-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","DBT/ Informatica","Snowflake","Python/ PySpark","Shell Scripting","Power BI"],"x-skills-preferred":["Agile methodology","Software development","data development","Commercial / Business Insurance"],"datePosted":"2026-03-09T17:07:40.702Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, DBT/ Informatica, Snowflake, Python/ PySpark, Shell Scripting, Power BI, Agile methodology, Software development, data development, Commercial / Business Insurance"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b0e6c6db-b61"},"title":"FBS - Finance Mgr II-Bus Ops Fin (Consultant)","description":"<p><strong>Finance Manager II - Business Operations 
Finance (Consultant)</strong></p>\n<p>Our client is a leading insurance company seeking a Finance Manager II - Business Operations Finance (Consultant) to provide business unit finance support by partnering with the business unit in developing and executing on business strategies and tactical operations to achieve overall company goals.</p>\n<p><strong>Essential Job Functions</strong></p>\n<ul>\n<li>Provide business insight to the current financial situation of the business through analysis of key financial indicators on a regular basis.</li>\n<li>Work closely with the business unit to determine the risks and opportunities of the current financial climate and how best to mitigate risks and capitalize on opportunities.</li>\n<li>Identify financial opportunities and partner with business partner to develop, plan and execute on the opportunity with metrics and deliverables.</li>\n<li>Conduct competitor intelligence.</li>\n<li>Present complex ideas and concepts to business leaders and executives.</li>\n<li>Implement policies and procedures.</li>\n<li>Promote safety at all times and comply with safety/ergonomic standards as outlined in relevant company published manuals.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>6 years of prior Accounting or Finance experience including 1 year as an Accounting or Finance Supervisor and 1 year as an Accounting Manager or Supervisor.</li>\n<li>Bachelor’s degree required in Accounting, Finance or related field. 
CPA, CMA, MBA, Masters or similar preferred.</li>\n<li>English Proficiency: Fluent.</li>\n<li>Intermediate skills in Excel, SQL/Snowflake, and PowerPoint.</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Career development and training opportunities.</li>\n<li>Dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health and Dental Insurance.</li>\n<li>Pension Plan.</li>\n<li>Meals tickets.</li>\n<li>Life Insurance.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b0e6c6db-b61","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/5sFZnHiQE3NvECCAK5NxDq/hybrid-fbs---finance-mgr-ii-bus-ops-fin-(consultant)-in-bogot%C3%A1-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Excel","SQL/Snowflake","PowerPoint","Accounting","Finance"],"x-skills-preferred":[],"datePosted":"2026-03-09T17:06:35.694Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bogotá, Bogota, Colombia"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"Excel, SQL/Snowflake, PowerPoint, Accounting, Finance"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6b65ab0d-2d3"},"title":"FBS - Finance Mgr II-Bus Ops Fin (Consultant)","description":"<p><strong>Finance Manager II - Business Operations Finance (Consultant)</strong></p>\n<p>Our client is a leading insurance company seeking a Finance Manager II to provide business unit finance support. 
The successful candidate will partner with the business unit to develop and execute on business strategies and tactical operations to achieve overall company goals.</p>\n<p><strong>Essential Job Functions</strong></p>\n<ul>\n<li>Provide business insight to the current financial situation of the business through analysis of key financial indicators on a regular basis.</li>\n<li>Work closely with the business unit to determine the risks and opportunities of the current financial climate and how best to mitigate risks and capitalize on opportunities.</li>\n<li>Identify financial opportunities and partner with business partner to develop, plan and execute on the opportunity with metrics and deliverables.</li>\n<li>Conduct competitor intelligence.</li>\n<li>Present complex ideas and concepts to business leaders and executives.</li>\n<li>Implement policies and procedures.</li>\n<li>Promote safety at all times and comply with safety/ergonomic standards as outlined in relevant company published manuals.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Experience Requirements:</li>\n<li>External candidates should have a minimum of 6 years of prior Accounting or Finance experience including 1 year as an Accounting or Finance Supervisor and 1 year as an Accounting Manager or Supervisor.</li>\n<li>The same time-on-job requirement is preferred for internal candidates.</li>\n<li>Education Requirements:</li>\n<li>Bachelor’s degree required in Accounting, Finance or related field.</li>\n<li>CPA, CMA, MBA, Masters or similar preferred.</li>\n<li>English Proficiency:</li>\n<li>Minimum Required: Fluent</li>\n<li>Software / Tool Skills:</li>\n<li>Excel - Intermediate</li>\n<li>SQL/Snowflake - Intermediate</li>\n<li>PowerPoint - Intermediate</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package:</p>\n<ul>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits 
package</li>\n<li>Career development and training opportunities</li>\n<li>Dynamic and inclusive work culture within a globally renowned group</li>\n<li>Private Health and Dental Insurance</li>\n<li>Pension Plan</li>\n<li>Meals tickets</li>\n<li>Life Insurance</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6b65ab0d-2d3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/h3EW59g5aTA2mrJ8gFQauZ/hybrid-fbs---finance-mgr-ii-bus-ops-fin-(consultant)-in-aguascalientes-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Excel","SQL/Snowflake","PowerPoint","Accounting","Finance"],"x-skills-preferred":[],"datePosted":"2026-03-09T17:04:51.820Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Aguascalientes, Aguascalientes, Mexico"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"Excel, SQL/Snowflake, PowerPoint, Accounting, Finance"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b4eae1f8-e30"},"title":"FBS - Finance Mgr II-Bus Ops Fin (Consultant)","description":"<p><strong>Finance Manager II - Business Operations Finance</strong></p>\n<p>Our client is a leading insurance company seeking a Finance Manager II - Business Operations Finance to join their team. 
As a key member of the finance team, you will provide business unit finance support by partnering with the business unit in developing and executing on business strategies and tactical operations to achieve overall company goals.</p>\n<p><strong>Essential Job Functions</strong></p>\n<ul>\n<li>Provide business insight into the current financial situation of the business through analysis of key financial indicators on a regular basis.</li>\n<li>Work closely with the business unit to determine the risks and opportunities of the current financial climate and how best to mitigate risks and capitalize on opportunities.</li>\n<li>Identify financial opportunities and partner with business partners to develop, plan, and execute on the opportunity with metrics and deliverables.</li>\n<li>Conduct competitor intelligence.</li>\n<li>Present complex ideas and concepts to business leaders and executives.</li>\n<li>Implement policies and procedures.</li>\n<li>Promote safety at all times and comply with safety/ergonomic standards as outlined in relevant company published manuals.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>6 years of prior Accounting or Finance experience, including 1 year as an Accounting or Finance Supervisor and 1 year as an Accounting Manager or Supervisor.</li>\n<li>Bachelor&#39;s degree in Accounting, Finance or related field.</li>\n<li>CPA, CMA, MBA, Masters or similar preferred.</li>\n<li>English Proficiency: Fluent.</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package, including:</p>\n<ul>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Career development and training opportunities</li>\n<li>Dynamic and inclusive work culture within a globally renowned group</li>\n<li>Private Health and Dental Insurance</li>\n<li>Pension Plan</li>\n<li>Meal tickets</li>\n<li>Life Insurance</li>\n</ul>\n<p 
style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b4eae1f8-e30","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/k4qF17n5XBwX63gEQsR9bA/hybrid-fbs---finance-mgr-ii-bus-ops-fin-(consultant)-in-mexico-city-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Excel","SQL/Snowflake","PowerPoint"],"x-skills-preferred":[],"datePosted":"2026-03-09T17:04:44.235Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Mexico City"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"Excel, SQL/Snowflake, PowerPoint"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_65e7bd92-c31"},"title":"FBS Analytics Engineer","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>\n<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. 
Capgemini is responsible for managing local payroll and benefits.</p>\n<p><strong>What to expect on your journey with us:</strong></p>\n<ul>\n<li>A solid and innovative company with a strong market presence</li>\n</ul>\n<ul>\n<li>A dynamic, diverse, and multicultural work environment</li>\n</ul>\n<ul>\n<li>Leaders with deep market knowledge and strategic vision</li>\n</ul>\n<ul>\n<li>Continuous learning and development</li>\n</ul>\n<p><strong>Team Function</strong> The Direct modeling team is focused on creating models to guide enterprise marketing decisions that will help to promote brand awareness as well as boost sales through the direct channel.</p>\n<p><strong>Role Description:</strong></p>\n<p>This position plays a crucial role in the data ecosystem by iteratively transforming raw data into structured, high-quality datasets that are ready for analysis in partnership with data/decision scientists. The role primarily focuses on moderately complex business problems while receiving limited coaching and guidance from data leadership. The role combines the technical skills of a data engineer, the analytical mindset of a data analyst, and strong business acumen to ensure data is not only collected and stored efficiently but also made accessible and insightful for end users. In partnership with data/decision scientists, the position is responsible for end-to-end data workflow including data ingestion, transformation, modeling, and validation to enable data-driven decision-making across the organization. 
This position requires deep understanding of data engineering, business processes, and analytics principles as well as a proactive approach to solving complex data challenges.</p>\n<p><strong>Essential Job Functions:</strong></p>\n<p><strong>1) Data infrastructure development</strong>: Pipeline Design and Development; Architects and builds scalable data pipelines using modern ELT (Extract, Load, Transform) tools and frameworks such as dbt (Data Build Tool), Apache Airflow, or similar. Automates data ingestion processes from various sources including databases, APIs, and third-party services. Data Storage and Management - Designs and implements data warehousing solutions using platforms like Snowflake, Redshift, or BigQuery. Optimizes storage solutions for performance, cost efficiency, and scalability.</p>\n<p><strong>2) Data modeling and transformation:</strong> Data Modeling - Develops and maintains logical and physical data models to support business analytics. Creates and manages dimensional models, star/snowflake schemas, and other data structures. Data Transformation - Transforms raw data into clean, organized, and analytics-ready datasets using SQL, Python, or other relevant languages. Implements data transformation workflows to handle data cleansing, normalization, and enrichment. Data Quality Assurance - Conducts data validation and consistency checks to ensure the accuracy and reliability of data. Implements data quality monitoring and alerting mechanisms.</p>\n<p><strong>3) Collaboration and stakeholder management:</strong> Cross-Functional Collaboration - Works closely with data analysts, data scientists, and business stakeholders to gather requirements and understand their data needs. Acts as a liaison between technical teams and business units to translate business requirements into technical specifications. Technical Communication - Clearly communicates complex technical concepts and data insights to non-technical stakeholders. 
Provides training and support to team members on data tools, best practices, and methodologies.</p>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Over 4 years of experience in data development and analytics engineering using Python, SQL, DBT and Snowflake.</li>\n</ul>\n<ul>\n<li>Bachelor’s degree in Computer Science, Data Science, Engineering or other Math or Technology related degrees.</li>\n</ul>\n<ul>\n<li>Fluency in English</li>\n</ul>\n<p><strong>Software / Tools</strong></p>\n<ul>\n<li>SQL (must have)</li>\n</ul>\n<ul>\n<li>Python (must have)</li>\n</ul>\n<ul>\n<li>Snowflake (must have)</li>\n</ul>\n<ul>\n<li>DBT (must have)</li>\n</ul>\n<p><strong>Other Critical Skills</strong></p>\n<ul>\n<li>Data Transformation</li>\n</ul>\n<ul>\n<li>Data Quality Assurance</li>\n</ul>\n<ul>\n<li>Pipeline Design and Development</li>\n</ul>\n<ul>\n<li>Technical Communication</li>\n</ul>\n<ul>\n<li>Independent work</li>\n</ul>\n<ul>\n<li>Orientation to detail</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n</ul>\n<ul>\n<li>Comprehensive benefits package.</li>\n</ul>\n<ul>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n</ul>\n<ul>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n</ul>\n<ul>\n<li>Private Health Insurance.</li>\n</ul>\n<ul>\n<li>Paid Time Off.</li>\n</ul>\n<ul>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_65e7bd92-c31","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/ws76jLTZQ1JKbCcs3CUiC4/remote-fbs-analytics-engineer-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Python","Snowflake","DBT","Data Transformation","Data Quality Assurance","Pipeline Design and Development","Technical Communication","Independent work","Orientation to detail"],"x-skills-preferred":[],"datePosted":"2026-03-09T17:02:50.114Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, Snowflake, DBT, Data Transformation, Data Quality Assurance, Pipeline Design and Development, Technical Communication, Independent work, Orientation to detail"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b258bd6d-41a"},"title":"FBS - Finance Mgr II-Bus Ops Fin (Consultant)","description":"<p><strong>Job Summary</strong></p>\n<p>Provides business unit (Personal Lines, Claims, Commercial Lines) finance support by partnering with the business unit in developing and executing on business strategies and tactical operations to achieve overall company goals.</p>\n<p><strong>Essential Job Functions</strong></p>\n<ul>\n<li>Provide business insight to the current financial situation of the business through analysis of key financial indicators on a regular basis.</li>\n<li>Works closely with the business unit to determine the risks and opportunities of the current financial climate and how best to mitigate risks and capitalize on opportunities.</li>\n<li>Identify financial opportunities and partner with business 
partners to develop, plan, and execute on the opportunity with metrics and deliverables.</li>\n<li>Conduct competitor intelligence.</li>\n<li>Present complex ideas and concepts to business leaders and executives.</li>\n<li>Implement policies and procedures.</li>\n<li>Promotes safety at all times and complies with safety/ergonomic standards as outlined in relevant company published manuals.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<p><strong>Experience Requirements</strong></p>\n<p>External candidates should have a minimum of 6 years of prior Accounting or Finance experience including 1 year as an Accounting or Finance Supervisor and 1 year as an Accounting Manager or Supervisor. The same time-on-job requirement is preferred for internal candidates.</p>\n<p><strong>Education Requirements</strong></p>\n<p>Bachelor’s degree required in Accounting, Finance or related field. CPA, CMA, MBA, Masters or similar preferred.</p>\n<p><strong>English Proficiency</strong></p>\n<p>Minimum Required: Fluent</p>\n<p><strong>Software / Tool Skills</strong></p>\n<ul>\n<li>Excel - Intermediate</li>\n<li>SQL/Snowflake - Intermediate</li>\n<li>PowerPoint - Intermediate</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package:</p>\n<ol>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Career development and training opportunities</li>\n<li>Dynamic and inclusive work culture within a globally renowned group</li>\n<li>Private Health and Dental Insurance</li>\n<li>Pension Plan</li>\n<li>Meal tickets</li>\n<li>Life Insurance</li>\n</ol>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b258bd6d-41a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/rE5o92wDibcEkvwETwAMt3/hybrid-fbs---finance-mgr-ii-bus-ops-fin-(consultant)-in-s%C3%A3o-paulo-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Excel","SQL/Snowflake","PowerPoint","Accounting","Finance"],"x-skills-preferred":[],"datePosted":"2026-03-09T17:00:41.699Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"São Paulo, State of São Paulo, Brazil"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"Excel, SQL/Snowflake, PowerPoint, Accounting, Finance"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7af16166-8fd"},"title":"FBS Senior Data Domain Architect","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. 
That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p><strong>What to expect on your journey with us:</strong></p>\n<ul>\n<li>A solid and innovative company with a strong market presence</li>\n<li>A dynamic, diverse, and multicultural work environment</li>\n<li>Leaders with deep market knowledge and strategic vision</li>\n<li>Continuous learning and development</li>\n</ul>\n<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>\n<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impacts the organization.</li>\n<li>Works independently, receives minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>\n<li>Reviews high level design to ensure alignment to Solution Architecture.</li>\n<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>\n<li>Mentor developers and create reference implementations/frameworks.</li>\n<li>Partners with System Architects to elaborate capabilities and features.</li>\n<li>Delivers single domain architecture solutions and executes continuous domain level architecture improvement roadmap. 
Actively supports design and steering of a continuous delivery pipeline.</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>Over 6 years of experience as a senior domain architect for Data domains</li>\n<li>Advanced English Level</li>\n<li>Master&#39;s degree (PLUS)</li>\n<li>Insurance Experience (PLUS); Financial Services (PLUS)</li>\n</ul>\n<p><strong>Technical &amp; Business Skills:</strong></p>\n<ul>\n<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>\n<li>Data Architecture / Data Modeling – Advanced (MUST)</li>\n<li>Data Warehouse – Advanced (MUST)</li>\n<li>Cloud Data Platforms - Advanced</li>\n<li>Data Integration Tools – Advanced</li>\n<li>Snowflake or Databricks - Intermediate (4-6 Years) MUST</li>\n<li>Any Cloud - Intermediate (4-6 Years)</li>\n<li>Power BI or Tableau - Intermediate (4-6 Years)</li>\n<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>\n<li>Data Lakehouse – Intermediate (MUST)</li>\n</ul>\n<ul>\n<li>Data Governance - Intermediate</li>\n<li>AI/ML - Entry Level (PLUS)</li>\n<li>Master Data Management - Intermediate</li>\n<li>Operational Data Management - Intermediate</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7af16166-8fd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/jdUFHSPZZjHsgd3TR4R3BS/remote-fbs-senior-data-domain-architect-in-colombia-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["ETL/ELT Tools (Informatica, DBT)","Data Architecture / Data Modeling","Data Warehouse","Cloud Data Platforms","Data Integration Tools","Snowflake or Databricks","Any Cloud","Power BI or Tableau","Data Science tools (Sagemaker, Databricks)","Data Lakehouse"],"x-skills-preferred":["Data Governance","AI/ML","Master Data Management","Operational Data Management"],"datePosted":"2026-03-09T17:00:36.230Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, AI/ML, Master Data Management, Operational Data Management"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5a4be76f-140"},"title":"FBS Marketing Automation & Integration Engineer","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. 
That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>The team is responsible for architecting and maintaining scalable MarTech solutions, with a focus on data integration, customer journey orchestration, and marketing automation. This team operates within the Data, Tech, and Operations tower of the Direct BU.</p>\n<p>The Marketing Automation &amp; Integration Engineer centers on the implementation and optimization of a MarTech data flow pattern involving Snowflake, Segment, Braze, and other SaaS platforms. Key responsibilities include:</p>\n<ul>\n<li>Design and maintain data pipelines between Snowflake, Segment CDP, Braze, and additional platforms</li>\n<li>Implement real-time and batch data ingestion strategies</li>\n<li>Manage customer event tracking and identity resolution within Segment</li>\n<li>Orchestrate personalized marketing campaigns in Braze using dynamic segmentation and behavioral triggers</li>\n<li>Ensure data integrity and feedback loops from Braze back into Snowflake via Segment</li>\n<li>Automate data transformations and enrichment using scripting languages</li>\n<li>Monitor system performance and troubleshoot integration issues across platforms</li>\n</ul>\n<p>This position comes with competitive compensation and benefits package:</p>\n<ol>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Career development and training opportunities</li>\n<li>Flexible work arrangements (remote and/or office-based)</li>\n<li>Dynamic and inclusive work culture within a globally renowned group</li>\n<li>Private Health Insurance</li>\n<li>Pension Plan</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development</li>\n</ol>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5a4be76f-140","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/qJr4ny8yGpdyCcPXUusbL6/remote-fbs-marketing-automation-%26-integration-engineer-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Segment CDP","Braze","Snowflake","Scripting Languages (Python / JS)","Reverse ETL","Data Orchestration Platforms","Customer Data Schema Design","Data modeling and ETL/ELT Pipeline","API Integrations / Webhooks","Customer journey mapping and automation logic"],"x-skills-preferred":["Familiarity with insurance industry data and customer lifecycle models"],"datePosted":"2026-03-09T17:00:23.276Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Segment CDP, Braze, Snowflake, Scripting Languages (Python / JS), Reverse ETL, Data Orchestration Platforms, Customer Data Schema Design, Data modeling and ETL/ELT Pipeline, API Integrations / Webhooks, Customer journey mapping and automation logic, Familiarity with insurance industry data and customer lifecycle models"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c2b9b052-e51"},"title":"FBS - Associate Decision Scientist","description":"<p><strong>Our Role</strong></p>\n<p>As part of the Decision Science team, this role is central to connecting marketing performance with business outcomes. 
You&#39;ll transform data into actionable insights that guide strategic decisions across online and call center channels.</p>\n<p><strong>What You&#39;ll Do</strong></p>\n<ul>\n<li>Analyze marketing performance across channels and connect spend to customer value.</li>\n<li>Build and maintain dashboards and data models that accurately reflect marketing KPIs.</li>\n<li>Conduct deep-dive analyses to uncover drivers of performance and identify optimization opportunities.</li>\n<li>Collaborate with stakeholders from Marketing, Product, Finance, and Channel teams to support data-driven decision-making.</li>\n<li>Support broader Decision Science initiatives with analytical expertise.</li>\n<li>Translate complex data insights into clear, actionable recommendations for technical and non-technical audiences.</li>\n<li>Work with SQL dashboards and existing datasets to uncover patterns and solve business challenges.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Strong analytical mindset with demonstrated problem-solving skills.</li>\n<li>Excellent verbal communication and listening skills; basic storytelling ability.</li>\n<li>Experience in marketing analysis and campaign performance evaluation.</li>\n<li>Familiarity with marketing KPIs and conversion funnel optimization.</li>\n<li>Proficient in SQL and data wrangling techniques.</li>\n<li>Conceptual understanding of statistical and machine learning methods.</li>\n<li>Experience with data visualization tools (e.g., Tableau, Power BI).</li>\n<li>Technical aptitude with tools such as Python, R, Snowflake.</li>\n<li>Comfortable working in fast-paced environments and adapting to new technologies.</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package:</p>\n<ol>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Career development and training opportunities</li>\n<li>Flexible work arrangements 
(remote and/or office-based)</li>\n<li>Dynamic and inclusive work culture within a globally renowned group</li>\n<li>Private Health Insurance</li>\n<li>Pension Plan</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development</li>\n</ol>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_c2b9b052-e51","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/eNBdjfQELnW1ZWvvMgykpo/hybrid-fbs---associate-decision-scientist-in-s%C3%A3o-paulo-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data wrangling techniques","statistical and machine learning methods","data visualization tools","Python","R","Snowflake"],"x-skills-preferred":["Tableau","Power BI"],"datePosted":"2026-03-09T17:00:19.344Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"São Paulo, State of São Paulo, Brazil"}},"employmentType":"FULL_TIME","occupationalCategory":"Data Science","industry":"Finance","skills":"SQL, data wrangling techniques, statistical and machine learning methods, data visualization tools, Python, R, Snowflake, Tableau, Power BI"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5aabf454-ae0"},"title":"FBS Analytics Engineer","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. 
By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>\n<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>\n<p><strong>What to expect on your journey with us:</strong></p>\n<ul>\n<li>A solid and innovative company with a strong market presence</li>\n<li>A dynamic, diverse, and multicultural work environment</li>\n<li>Leaders with deep market knowledge and strategic vision</li>\n<li>Continuous learning and development</li>\n</ul>\n<p><strong>Team Function</strong></p>\n<p>The Direct modeling team is focused on creating models to guide enterprise marketing decisions that will help to promote brand awareness as well as boost sales through the direct channel.</p>\n<p><strong>Role Description:</strong></p>\n<p>This position plays a crucial role in the data ecosystem by iteratively transforming raw data into structured, high-quality datasets that are ready for analysis in partnership with data/decision scientists. The role primarily focuses on moderately complex business problems while receiving limited coaching and guidance from data leadership. The role combines the technical skills of a data engineer, the analytical mindset of a data analyst, and strong business acumen to ensure data is not only collected and stored efficiently but also made accessible and insightful for end users. 
In partnership with data/decision scientists, the position is responsible for end-to-end data workflow including data ingestion, transformation, modeling, and validation to enable data-driven decision-making across the organization. This position requires deep understanding of data engineering, business processes, and analytics principles as well as a proactive approach to solving complex data challenges.</p>\n<p><strong>Essential Job Functions:</strong></p>\n<p><strong>1) Data infrastructure development</strong>: Pipeline Design and Development; Architects and builds scalable data pipelines using modern ELT (Extract, Load, Transform) tools and frameworks such as dbt (Data Build Tool), Apache Airflow, or similar. Automates data ingestion processes from various sources including databases, APIs, and third-party services. Data Storage and Management - Designs and implements data warehousing solutions using platforms like Snowflake, Redshift, or BigQuery. Optimizes storage solutions for performance, cost efficiency, and scalability.</p>\n<p><strong>2) Data modeling and transformation:</strong> Data Modeling - Develops and maintains logical and physical data models to support business analytics. Creates and manages dimensional models, star/snowflake schemas, and other data structures. Data Transformation - Transforms raw data into clean, organized, and analytics-ready datasets using SQL, Python, or other relevant languages. Implements data transformation workflows to handle data cleansing, normalization, and enrichment. Data Quality Assurance - Conducts data validation and consistency checks to ensure the accuracy and reliability of data. Implements data quality monitoring and alerting mechanisms.</p>\n<p><strong>3) Collaboration and stakeholder management:</strong> Cross-Functional Collaboration - Works closely with data analysts, data scientists, and business stakeholders to gather requirements and understand their data needs. 
Acts as a liaison between technical teams and business units to translate business requirements into technical specifications. Technical Communication - Clearly communicates complex technical concepts and data insights to non-technical stakeholders. Provides training and support to team members on data tools, best practices, and methodologies.</p>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Over 4 years of experience in data development and analytics engineering using Python, SQL, DBT and Snowflake.</li>\n<li>Bachelor’s degree in Computer Science, Data Science, Engineering or other Math or Technology related degrees.</li>\n<li>Fluency in English</li>\n</ul>\n<p><strong>Software / Tools</strong></p>\n<ul>\n<li>SQL (must have)</li>\n<li>Python (must have)</li>\n<li>Snowflake (must have)</li>\n<li>DBT (must have)</li>\n</ul>\n<p><strong>Other Critical Skills</strong></p>\n<ul>\n<li>Data Transformation</li>\n<li>Data Quality Assurance</li>\n<li>Pipeline Design and Development</li>\n<li>Technical Communication</li>\n<li>Independent work</li>\n<li>Orientation to detail</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5aabf454-ae0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/htNwC3gPnBQ9oxedafiBav/remote-fbs-analytics-engineer-in-mexico-at-capgemini","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Python","Snowflake","DBT","Data Transformation","Data Quality Assurance","Pipeline Design and Development","Technical Communication","Independent work","Orientation to detail"],"x-skills-preferred":[],"datePosted":"2026-03-09T17:00:16.662Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, Snowflake, DBT, Data Transformation, Data Quality Assurance, Pipeline Design and Development, Technical Communication, Independent work, Orientation to detail"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a1e677dc-c49"},"title":"FBS Assoc Business Solution Analyst","description":"<p><strong>Job Description</strong></p>\n<p>Our client is a leading insurer seeking a Business Solution Analyst to join their team. As a Business Solution Analyst, you will play a critical role in defining, understanding, quantifying, and explaining metrics for the department. 
You will work closely with Data Leadership and Data team leads to determine metrics to capture and measure for progress and define how the metric will be defined and calculated.</p>\n<p><strong>Role Responsibilities</strong></p>\n<ul>\n<li>Design, develop, and maintain business intelligence solutions using PowerBI</li>\n<li>Work closely with stakeholders to understand their reporting needs and translate them into actionable insights</li>\n<li>Build and maintain dashboards and reports for the data department</li>\n<li>Expert in visualization technologies such as PowerBI</li>\n<li>Define, capture, and publish metrics ensuring continuous improvement and driving adherence to achieving goals</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Bachelor&#39;s degree</li>\n<li>English proficiency</li>\n<li>Power BI Expertise (4-6 years of experience)</li>\n<li>Strong hands-on experience creating, modifying, and maintaining reports and dashboards</li>\n<li>Comfortable connecting to various backend systems</li>\n<li>Must have experience with real-world Power BI reporting</li>\n<li>Ability to understand stakeholder needs and translate them into effective visuals</li>\n<li>Can communicate data insights clearly and influence teams</li>\n<li>Able to problem-solve and normalize inconsistent data from multiple sources</li>\n<li>Must handle ambiguity and shift priorities well</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package:</p>\n<ol>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Career development and training opportunities</li>\n<li>Flexible work arrangements (remote and/or office-based)</li>\n<li>Dynamic and inclusive work culture within a globally renowned group</li>\n<li>Private Health Insurance</li>\n<li>Pension Plan</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development</li>\n</ol>\n<p 
style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_a1e677dc-c49","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/2VaJGuY5MDEbDP6Gxc1FB4/remote-fbs-assoc-business-solution-analyst-in-mexico-at-capgemini","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Power BI","Data Analysis","Business Intelligence","Visualization","SQL Server","Snowflake","ETL concepts"],"x-skills-preferred":["Data Quality Dimensions","Data Environments","Team Player","Emotional Intelligence","Influencing Skills"],"datePosted":"2026-03-09T16:59:43.493Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"IT","industry":"Finance","skills":"Power BI, Data Analysis, Business Intelligence, Visualization, SQL Server, Snowflake, ETL concepts, Data Quality Dimensions, Data Environments, Team Player, Emotional Intelligence, Influencing Skills"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7fc50e60-df1"},"title":"FBS - Associate Decision Scientist","description":"<p><strong>Description</strong></p>\n<p>Our client is a leading insurer with a wide range of products and services. As part of the Decision Science team, this role is central to connecting marketing performance with business outcomes.</p>\n<p><strong>About the role</strong></p>\n<p>As an Associate Decision Scientist, you&#39;ll transform data into actionable insights that guide strategic decisions across online and call center channels. 
You&#39;ll combine analytical rigor with storytelling to help stakeholders understand what&#39;s driving performance and how to optimize future campaigns.</p>\n<p><strong>What You&#39;ll Do</strong></p>\n<ul>\n<li>Analyze marketing performance across channels and connect spend to customer value.</li>\n<li>Build and maintain dashboards and data models that accurately reflect marketing KPIs.</li>\n<li>Conduct deep-dive analyses to uncover drivers of performance and identify optimization opportunities.</li>\n<li>Collaborate with stakeholders from Marketing, Product, Finance, and Channel teams to support data-driven decision-making.</li>\n<li>Support broader Decision Science initiatives with analytical expertise.</li>\n<li>Translate complex data insights into clear, actionable recommendations for technical and non-technical audiences.</li>\n<li>Work with SQL dashboards and existing datasets to uncover patterns and solve business challenges.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Strong analytical mindset with demonstrated problem-solving skills.</li>\n<li>Excellent verbal communication and listening skills; basic storytelling ability.</li>\n<li>Experience in marketing analysis and campaign performance evaluation.</li>\n<li>Familiarity with marketing KPIs and conversion funnel optimization.</li>\n<li>Proficient in SQL and data wrangling techniques.</li>\n<li>Conceptual understanding of statistical and machine learning methods.</li>\n<li>Experience with data visualization tools (e.g., Tableau, Power BI).</li>\n<li>Technical aptitude with tools such as Python, R, Snowflake.</li>\n<li>Comfortable working in fast-paced environments and adapting to new technologies.</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package:</p>\n<ol>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Career development and training 
opportunities</li>\n<li>Flexible work arrangements (remote and/or office-based)</li>\n<li>Dynamic and inclusive work culture within a globally renowned group</li>\n<li>Private Health Insurance</li>\n<li>Pension Plan</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development</li>\n</ol>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7fc50e60-df1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/eGkj27cvQSz49b7P36NHvq/hybrid-fbs---associate-decision-scientist-in-mexico-city-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data wrangling techniques","statistical and machine learning methods","data visualization tools","Python","R","Snowflake"],"x-skills-preferred":["Tableau","Power BI"],"datePosted":"2026-03-09T16:59:31.579Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Mexico City"}},"employmentType":"FULL_TIME","occupationalCategory":"Marketing","industry":"Finance","skills":"SQL, data wrangling techniques, statistical and machine learning methods, data visualization tools, Python, R, Snowflake, Tableau, Power BI"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7b03b30a-b20"},"title":"FBS Senior Data Domain Architect","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. 
By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>\n<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>\n<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>\n</ul>\n<ul>\n<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>\n</ul>\n<ul>\n<li>Works independently, receives minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>\n</ul>\n<ul>\n<li>Reviews high level design to ensure alignment to Solution Architecture.</li>\n</ul>\n<ul>\n<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>\n</ul>\n<ul>\n<li>Mentors developers and creates reference implementations/frameworks.</li>\n</ul>\n<ul>\n<li>Partners with System Architects to elaborate capabilities and features.</li>\n</ul>\n<ul>\n<li>Delivers single domain architecture solutions and executes
continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7b03b30a-b20","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/1U952YA2QBa8zK7Tm5d3Lm/remote-fbs-senior-data-domain-architect-in-mexico-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["ETL/ELT Tools (Informatica, DBT)","Data Architecture / Data Modeling","Data Warehouse","Cloud Data Platforms","Data Integration Tools","Snowflake or Databricks","Any Cloud","Power BI or Tableau","Data Science tools (Sagemaker, Databricks)","Data Lakehouse","Data Governance","Master Data Management","Operational Data Management"],"x-skills-preferred":["AI/ML"],"datePosted":"2026-03-09T16:59:14.361Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, Master Data Management, Operational Data Management, AI/ML"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9c40b25b-28b"},"title":"FBS Senior Data Engineer (Airflow)","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. 
We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>We don&#39;t have a local legal entity, so we&#39;ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>\n<p>You can expect a solid and innovative company with a strong market presence, a dynamic, diverse, and multicultural work environment, leaders with deep market knowledge and strategic vision, and continuous learning and development.</p>\n<p>The new data platforms team will be our centralized shared services team supporting all data platforms such as Snowflake and Astronomer. They will be responsible for the strategy and implementation of these platforms as well as best practices for the business units to follow. In this case, the position is focused on Astronomer/Airflow.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Build and maintain automated data workflows and orchestrations using Apache Airflow</li>\n<li>Implement at least two major end-to-end data pipeline projects using Airflow</li>\n<li>Design and optimize complex DAGs for scalability, maintainability, and reliability</li>\n<li>Create reusable, parameterized, and modular Airflow components (operators, sensors, hooks) to streamline workflow development</li>\n<li>Ensure effective monitoring, alerting, and logging of Airflow DAGs for quick issue resolution</li>\n<li>Document workflows, solutions, and processes for team knowledge sharing and training</li>\n<li>Mentor and support other team members in Airflow usage and adoption</li>\n<li>Explain best practices, identify pros and cons, and communicate technical decisions to team members</li>\n<li>Develop reusable frameworks, leveraging reusable concepts for efficiency and scalability</li>\n<li>Implement and utilize reusable
ecosystem components, including Python &amp; Apache Airflow, DynamoDB, Amazon RDS</li>\n<li>Develop reusable frameworks to enforce data governance and data quality standards</li>\n<li>CI/CD pipeline development using re-usable frameworks and Jenkins</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Between 4-6 years of experience in a similar role</li>\n<li>Bachelor&#39;s degree in IT, Information systems, Computer Science or a related field</li>\n<li>Insurance Experience (Desirable)</li>\n<li>Fluency in English</li>\n<li>Availability to work according to CST or PST time zones.</li>\n</ul>\n<p><strong>Technical Skills</strong></p>\n<ul>\n<li>Airflow (MUST) / Astronomer (PLUS) - Advanced (5 Years)</li>\n<li>Python - Advanced (4-6 Years) (MUST)</li>\n<li>Snowflake – Intermediate (MUST)</li>\n<li>DBT - Entry Level (PLUS)</li>\n<li>AWS Glue - Entry Level (PLUS)</li>\n<li>DynamoDB - Intermediate</li>\n<li>Amazon RDS - Intermediate</li>\n<li>Jenkins - Intermediate</li>\n</ul>\n<p><strong>Other Critical Skills</strong></p>\n<ul>\n<li>Work Independently</li>\n<li>Strategic Thinking</li>\n<li>Guide Others</li>\n<li>Documentation</li>\n<li>Explain best practices</li>\n<li>Communicate Technical Decisions</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9c40b25b-28b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/mzMboyMoUyGryzfFUD6uuZ/remote-fbs-senior-data-engineer-(airflow)-in-mexico-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Airflow","Python","Snowflake","DBT","AWS Glue","DynamoDB","Amazon RDS","Jenkins"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:59:05.587Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Airflow, Python, Snowflake, DBT, AWS Glue, DynamoDB, Amazon RDS, Jenkins"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_aafa7b92-fa6"},"title":"Senior Consultant - Data Engineering & Data Science (m/w/d)","description":"<p>Are you looking to advance your career and work with experienced, talented colleagues to successfully solve the most important challenges of our clients? We are growing further and looking for enthusiastic individuals to strengthen our team. You will be part of a dynamic, strongly growing company with over 300,000 employees.</p>\n<p>Our dynamic organisation allows you to work across topics and bring in your ideas, experiences, creativity, and goal orientation. Are you ready?</p>\n<p>As a Consultant/Senior Consultant in the Data Engineering &amp; Data Science field, you will work hands-on on the conception, development, and implementation of modern data and analytics solutions. 
You will support the entire project lifecycle - from data intake and transformation to analytics and machine learning to productive operation.</p>\n<p>You will work closely with data engineers, architects, data scientists, and subject matter experts to implement scalable, reliable, and value-adding solutions in complex customer environments.</p>\n<p><strong>Your Tasks</strong></p>\n<ul>\n<li>Apply data science methods (machine learning, deep learning, GenAI) to solve concrete business questions</li>\n<li>Work with structured and semi-structured data in data lakes, lakehouses, and data warehouses</li>\n<li>Set up data pipelines for analytical workloads</li>\n<li>Support the productive implementation of data and ML solutions, including monitoring and optimisation</li>\n</ul>\n<p><strong>What You Bring - Required</strong></p>\n<ul>\n<li>At least 3 years of relevant professional experience in the field of data engineering, data science, or analytics</li>\n<li>Hands-on experience in implementing data and analytics solutions in (customer) projects</li>\n<li>Strong problem-solving skills and a pragmatic, implementation-oriented way of working</li>\n</ul>\n<p><strong>Data Engineering Fundamentals</strong></p>\n<ul>\n<li>Experience in setting up data pipelines (ingestion, transformation, storage)</li>\n<li>Solid understanding of data modeling, data transformations, and feature engineering</li>\n<li>Experience with cloud-based data platforms, such as:</li>\n</ul>\n<ol>\n<li>Azure, AWS, or GCP</li>\n<li>Databricks, Snowflake, BigQuery, Azure Synapse/Microsoft Fabric</li>\n</ol>\n<ul>\n<li>Knowledge of CI/CD concepts and production-ready deployments</li>\n</ul>\n<p><strong>Applied Data Science &amp; Analytics</strong></p>\n<ul>\n<li>Experience in applying GenAI, deep learning, and machine learning procedures as well as statistical analyses</li>\n<li>Very good programming skills in Python</li>\n<li>Very good SQL skills and experience with relational 
databases</li>\n<li>Experience in deploying and productively using ML models</li>\n<li>Ability to translate analytical results into business-relevant insights</li>\n<li>Bachelor&#39;s or master&#39;s degree in computer science, engineering, mathematics, or a related field, or equivalent practical experience</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Experience with:</li>\n</ul>\n<ol>\n<li>Streaming technologies (e.g. Kafka, Azure Event Hubs)</li>\n<li>Time series analysis, NLP applications, or system modeling</li>\n<li>NoSQL databases (e.g. MongoDB, Cosmos DB)</li>\n<li>Docker and Kubernetes</li>\n<li>Data visualization tools like Power BI, Tableau</li>\n<li>Cloud or architecture certifications</li>\n</ol>\n<p><strong>Language &amp; Mobility (Germany)</strong></p>\n<ul>\n<li>Fluent German skills (at least C1) for customer communication in the German-speaking market</li>\n<li>Very good English skills</li>\n<li>Project-related travel readiness</li>\n</ul>\n<p><strong>Your Team</strong></p>\n<p>You will become part of our growing Data &amp; Analytics teams. In this area, you will work with modern technologies in modern data ecosystems. You have the opportunity to turn your own ideas into results - in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>\n<p><strong>About Infosys Consulting</strong></p>\n<p>You will become an employee of a globally renowned management consulting firm at the forefront of technological innovation and industrial transformation. We work across industries with leading companies. Our culture is inclusive and entrepreneurial. 
As a mid-sized consulting firm embedded in the size of Infosys, we can support our customers worldwide and throughout the entire transformation process in a partnership-like manner.</p>\n<p>Our values IC-LIFE - Inclusion, Equity &amp; Diversity, Client, Leadership, Integrity, Fairness, and Excellence - form our compass of values. Further information can be found on our career website.</p>\n<p>In Europe, we are awarded by the Financial Times and Forbes as one of the leading consulting firms. Infosys is ranked among the top employers in Germany 2023 and has been certified by the Top Employers Institute for outstanding working conditions in Europe for five consecutive years.</p>\n<p>We offer a market-leading salary, attractive additional benefits, and excellent opportunities for further education and development. Have you become curious? Then we look forward to your application - apply now!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_aafa7b92-fa6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Infosys Consulting - Europe","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/ecAfMkjFkA97qaoimVMGNF/hybrid-(senior)-consultant---data-engineering-%26-data-science-(m%2Fw%2Fd)--deutschlandweit-in-munich-at-infosys-consulting---europe","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Data Science","Machine Learning","Deep Learning","GenAI","Data Engineering","Data Warehousing","Data Lakes","Lakehouses","Data Pipelines","Cloud-based Data Platforms","Azure","AWS","GCP","Databricks","Snowflake","BigQuery","Azure Synapse","Microsoft Fabric","CI/CD","Python","SQL","Relational Databases"],"x-skills-preferred":["Streaming Technologies","Time Series Analysis","NLP 
Applications","System Modeling","NoSQL Databases","Docker","Kubernetes","Data Visualization Tools","Cloud Certifications","Architecture Certifications"],"datePosted":"2026-03-09T16:55:58.580Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Munich, Bavaria, Germany"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Data Science, Machine Learning, Deep Learning, GenAI, Data Engineering, Data Warehousing, Data Lakes, Lakehouses, Data Pipelines, Cloud-based Data Platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse, Microsoft Fabric, CI/CD, Python, SQL, Relational Databases, Streaming Technologies, Time Series Analysis, NLP Applications, System Modeling, NoSQL Databases, Docker, Kubernetes, Data Visualization Tools, Cloud Certifications, Architecture Certifications"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ba5e5f71-701"},"title":"FBS Associate Analytics Engineer","description":"<p>FBS Associate Analytics Engineer</p>\n<p>We are seeking an FBS Associate Analytics Engineer to join our team. As an FBS Associate Analytics Engineer, you will play a key role in transforming raw data into structured, high-quality datasets that are ready for analysis. You will work on low to moderately complex business problems, receiving coaching and guidance from data leadership. 
Your primary focus will be on the end-to-end data workflow, including data ingestion, transformation, modeling, and validation to enable data-driven decision-making across the organization.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Emerging data infrastructure development with coaching and guidance: Pipeline Design and Development – Architects and builds scalable data pipelines using modern ELT (Extract, Load, Transform) tools and frameworks such as DBT (Data Build Tool), Apache Airflow, or similar.</li>\n<li>Automates data ingestion processes from various sources including databases, APIs, and third-party services.</li>\n<li>Data Storage and Management - Designs and implements data warehousing solutions using platforms like Snowflake, Redshift, or BigQuery.</li>\n<li>Optimizes storage solutions for performance, cost efficiency, and scalability.</li>\n<li>Data Modeling - Develops and maintains logical and physical data models to support business analytics.</li>\n<li>Creates and manages dimensional models, star/snowflake schemas, and other data structures.</li>\n<li>Data Transformation - Transforms raw data into clean, organized, and analytics-ready datasets using SQL, Python, or other relevant languages.</li>\n<li>Data Quality Assurance - Conducts data validation and consistency checks to ensure the accuracy and reliability of data.</li>\n<li>Technology Stack - Utilizes modern data tools and technologies such as SQL, Python, dbt, Airflow, and cloud platforms like AWS, Azure, or GCP.</li>\n<li>Continuous Learning – Stays updated with the latest trends, best practices, and advancements in data engineering and analytics.</li>\n<li>Participates in professional development opportunities to enhance technical and analytical skills.</li>\n<li>Provides code as requirements for hardening and operationalization by technology with significant coaching, guidance, and feedback.</li>\n<li>Performs other duties as assigned.</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>1+ year of
experience working on a Data Environment</li>\n<li>Good Analytics mindset</li>\n<li>Knowledge in SQL</li>\n<li>Strong verbal communication and listening skills.</li>\n<li>Demonstrated written communication skills.</li>\n<li>Demonstrated analytical skills.</li>\n<li>Demonstrated problem solving skills.</li>\n<li>Effective interpersonal skills.</li>\n<li>Seeks to acquire knowledge in area of specialty.</li>\n<li>Possesses strong technical aptitude. Basic experience with SQL or similar, dimensional modeling, pipeline orchestration, building data pipelines to transform data, and BI visualizations.</li>\n<li>Python experience is a plus</li>\n</ul>\n<p>Benefits</p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ba5e5f71-701","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/jaxxjRWH9XxkRbr1TCrPb5/remote-fbs-associate-analytics-engineer-in-mexico-at-capgemini","x-work-arrangement":"remote","x-experience-level":"entry","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Python","DBT","Apache Airflow","Snowflake","Redshift","BigQuery","Data Modeling","Data Transformation","Data Quality Assurance","Cloud Platforms"],"x-skills-preferred":["Python 
experience"],"datePosted":"2026-03-09T16:55:09.881Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, DBT, Apache Airflow, Snowflake, Redshift, BigQuery, Data Modeling, Data Transformation, Data Quality Assurance, Cloud Platforms, Python experience"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_631fb3fa-c57"},"title":"FBS Senior Data Engineer : Finance Domain","description":"<p>We&#39;re seeking a Senior Data Engineer with a strong P&amp;C insurance background to join our team. As a Senior Data Engineer, you will be responsible for analysing business data stories and translating them into technical requirements. You will design, build, test, and implement data products of various complexity with minimal guidance.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design, build, test, and implement data products of various complexity</li>\n<li>Analyse business data stories and translate them into technical requirements</li>\n<li>Work with finance data, banks, or insurance sector</li>\n<li>Use agile methodology, software development, and data development</li>\n<li>Use Python/PySpark, Shell Scripting, and Power BI</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Advanced skills in DBT, Snowflake, and SQL</li>\n<li>Experience in the financial industry</li>\n<li>Intermediate skills in Python/PySpark, Shell Scripting, and Power BI</li>\n<li>Agile methodology, software development, and data development</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive compensation and benefits package</li>\n<li>Comprehensive benefits package</li>\n<li>Career development and training opportunities</li>\n<li>Flexible work arrangements (remote and/or office-based)</li>\n<li>Dynamic and inclusive work culture within a globally renowned group</li>\n<li>Private Health 
Insurance</li>\n<li>Pension Plan</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development</li>\n</ul>\n<p>Note: Benefits differ based on employee level.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_631fb3fa-c57","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/ek6Tt5quFeduFHWtqok7zx/hybrid-fbs-senior-data-engineer-%3A-finance-domain-in-pune-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["DBT","Snowflake","SQL","Python/PySpark","Shell Scripting","Power BI"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:54:54.478Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Pune, Maharashtra, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"DBT, Snowflake, SQL, Python/PySpark, Shell Scripting, Power BI"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2e1270db-bb7"},"title":"FBS Senior Data Engineer (Airflow)","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>We are looking for a Senior Data Engineer to join our new data platforms team, which will be our centralized shared services team supporting all data platforms such as Snowflake and Astronomer. 
The position is focused on Astronomer/Airflow.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Build and maintain automated data workflows and orchestrations using Apache Airflow</li>\n<li>Implement at least two major end-to-end data pipeline projects using Airflow</li>\n<li>Design and optimize complex DAGs for scalability, maintainability, and reliability</li>\n<li>Create reusable, parameterized, and modular Airflow components (operators, sensors, hooks) to streamline workflow development</li>\n<li>Ensure effective monitoring, alerting, and logging of Airflow DAGs for quick issue resolution</li>\n<li>Document workflows, solutions, and processes for team knowledge sharing and training</li>\n<li>Mentor and support other team members in Airflow usage and adoption</li>\n<li>Explain best practices, identify pros and cons, and communicate technical decisions to team members</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Between 4-6 years of experience in a similar role</li>\n<li>Bachelor&#39;s degree in IT, Information systems, Computer Science or a related field</li>\n<li>Insurance Experience (Desirable)</li>\n<li>Fluency in English</li>\n<li>Availability to work according to CST or PST time zones.</li>\n</ul>\n<p><strong>Technical Skills</strong></p>\n<ul>\n<li>Airflow (MUST) / Astronomer (PLUS) - Advanced (5 Years)</li>\n<li>Python - Advanced (4-6 Years) (MUST)</li>\n<li>Snowflake – Intermediate (MUST)</li>\n<li>DBT - Entry Level (PLUS)</li>\n<li>AWS Glue - Entry Level (PLUS)</li>\n<li>DynamoDB - Intermediate</li>\n<li>Amazon RDS - Intermediate</li>\n<li>Jenkins - Intermediate</li>\n</ul>\n<p><strong>Other Critical Skills</strong></p>\n<ul>\n<li>Work Independently</li>\n<li>Strategic Thinking</li>\n<li>Guide Others</li>\n<li>Documentation</li>\n<li>Explain best practices</li>\n<li>Communicate Technical Decisions</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits 
package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2e1270db-bb7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/g6Kk9PeaSt9vgqEz7dTtY5/remote-fbs-senior-data-engineer-(airflow)-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Airflow","Python","Snowflake","DBT","AWS Glue","DynamoDB","Amazon RDS","Jenkins"],"x-skills-preferred":["Insurance Experience","Fluency in English"],"datePosted":"2026-03-09T16:54:51.948Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Airflow, Python, Snowflake, DBT, AWS Glue, DynamoDB, Amazon RDS, Jenkins, Insurance Experience, Fluency in English"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_dcfed817-412"},"title":"FBS Senior Data Domain Architect","description":"<p>We&#39;re looking for a Senior Data Domain Architect to join our team. 
As a Senior Data Domain Architect, you will design and develop Data/Domain IT architecture solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>\n<p><strong>What to expect on your journey with us:</strong></p>\n<ul>\n<li>A solid and innovative company with a strong market presence</li>\n<li>A dynamic, diverse, and multicultural work environment</li>\n<li>Leaders with deep market knowledge and strategic vision</li>\n<li>Continuous learning and development</li>\n</ul>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Utilize in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery</li>\n<li>Solve complex problems and partner effectively to execute broad, continuous Domain level architecture improvement roadmaps that impacts the organization</li>\n<li>Work independently, receives minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge</li>\n<li>Review high level design to ensure alignment to Solution Architecture</li>\n<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives</li>\n<li>Mentor developers and create reference implementations/frameworks</li>\n<li>Partner with System Architects to elaborate capabilities and features</li>\n<li>Deliver single domain architecture solutions and execute continuous domain level architecture improvement roadmap. 
Actively supports design and steering of a continuous delivery pipeline</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>Over 6 years of experience as a senior domain architect for Data domains</li>\n<li>Advanced English Level</li>\n<li>Master&#39;s degree (PLUS)</li>\n<li>Insurance Experience (PLUS) Financial Services (PLUS)</li>\n</ul>\n<p><strong>Technical &amp; Business Skills:</strong></p>\n<ul>\n<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>\n<li>Data Architecture / Data Modeling – Advanced (MUST)</li>\n<li>Data Warehouse – Advanced (MUST)</li>\n<li>Cloud Data Platforms - Advanced</li>\n<li>Data Integration Tools – Advanced</li>\n<li>Snowflake or Databricks - Intermediate (4-6 Years) MUST</li>\n<li>Any Cloud - Intermediate (4-6 Years)</li>\n<li>Power BI or Tableau - Intermediate (4-6 Years)</li>\n<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>\n<li>Data Lakehouse – Intermediate (MUST)</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<ul>\n<li>A competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Flexible work arrangements (remote and/or office-based)</li>\n<li>Private Health Insurance</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_dcfed817-412","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/x7tKXYFBB815ca6oBV5T2E/remote-fbs-senior-data-domain-architect-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["ETL/ELT Tools (Informatica, DBT)","Data 
Architecture / Data Modeling","Data Warehouse","Cloud Data Platforms","Data Integration Tools","Snowflake or Databricks","Any Cloud","Power BI or Tableau","Data Science tools (Sagemaker, Databricks)","Data Lakehouse"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:53:31.425Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ee2fcbdc-fc4"},"title":"Principal Consultant - Data Architecture","description":"<p><strong>Principal Consultant - Data Architecture</strong></p>\n<p>You will act as a senior technical leader in complex data and analytics engagements, shaping and governing end-to-end enterprise data architectures, leading technical teams, and serving as a trusted technical advisor for clients and internal stakeholders.</p>\n<p><strong>About Your Role</strong></p>\n<p>As a Principal Data Architecture Consultant, you will be responsible for ensuring that enterprise data and analytics solutions are scalable, secure, and production-ready, while translating business requirements into robust technical designs and delivery roadmaps.</p>\n<p><strong>Your Role Will Include:</strong></p>\n<ul>\n<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>\n<li>Translate business objectives into scalable, secure, and compliant data solutions</li>\n<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>\n<li>Guide delivery teams through implementation, rollout, and production 
readiness</li>\n<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>\n<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>\n<li>Support pre-sales and solution design activities from a technical perspective</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>\n<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>\n<li>Strong client-facing experience in complex enterprise environments</li>\n</ul>\n<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>\n<ul>\n<li>Strong expertise in modern data architectures, including:</li>\n<li>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures</li>\n<li>Modern Data Architecture design principles</li>\n<li>Batch and streaming data integration patterns</li>\n<li>Data Platform, DevOps, deployment and security architectures</li>\n<li>Analytics and AI enablement architectures</li>\n<li>Hands-on experience with cloud data platforms, e.g.:</li>\n<li>Azure, AWS or GCP</li>\n<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>\n<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>\n<li>Experience with NoSQL databases (e.g. 
Cosmos DB, MongoDB, InfluxDB)</li>\n<li>Solid understanding of API-based and event-driven architectures</li>\n<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>\n</ul>\n<p><strong>Engineering &amp; Platform Foundations</strong></p>\n<ul>\n<li>Experience with data pipelines, orchestration, and automation</li>\n<li>Familiarity with CI/CD concepts and production-grade deployments</li>\n<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>\n</ul>\n<p><strong>Data Management &amp; Governance</strong></p>\n<ul>\n<li>Strong understanding of data management and governance principles, including:</li>\n<li>Data quality, metadata, lineage, master data management</li>\n<li>Data Management software and tools</li>\n<li>Security, access control, and compliance considerations</li>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>\n<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>\n<li>Hands-on Experience with data governance or metadata tools</li>\n<li>Cloud, data, or architecture certifications</li>\n</ul>\n<p><strong>Language &amp; Mobility</strong></p>\n<ul>\n<li>Very good English skills</li>\n<li>Willingness to travel for project-related work</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will be utilizing the most innovative technological solutions in modern data ecosystem. 
In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>\n<p><strong>About Infosys Consulting</strong></p>\n<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>\n<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>\n<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on the Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>\n<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you.... 
Apply today!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ee2fcbdc-fc4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Infosys Consulting - Europe","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/uuSzzCt8qNbo6UpEFkSyjY/hybrid-principal-consultant---data-architecture-in-london-at-infosys-consulting---europe","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Data Mesh/ Data Fabric/ Data lake / data warehouse architectures","Modern Data Architecture design principles","Batch and streaming data integration patterns","Data Platform, DevOps, deployment and security architectures","Analytics and AI enablement architectures","Azure, AWS or GCP","Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric","Postgres, SQL Server, Oracle","Cosmos DB, MongoDB, InfluxDB","API-based and event-driven architectures","Docker / Kubernetes"],"x-skills-preferred":["Advanced analytics, AI / ML or GenAI","Streaming platforms (e.g. 
Kafka, Azure Event Hubs)","Data governance or metadata tools","Cloud, data, or architecture certifications"],"datePosted":"2026-03-09T16:52:06.783Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, Azure, AWS or GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, Postgres, SQL Server, Oracle, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, Docker / Kubernetes, Advanced analytics, AI / ML or GenAI, Streaming platforms (e.g. Kafka, Azure Event Hubs), Data governance or metadata tools, Cloud, data, or architecture certifications"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c818f438-851"},"title":"FBS CX Data Analyst","description":"<p>Our Client is seeking a Customer Experience (CX) Data Analyst to work at the intersection of data, customer behavior, and business strategy. In this role, you will analyze customer data to uncover trends, behaviors, and opportunities to enhance customer experience. You will develop dashboards and reports to support CX performance tracking and strategic initiatives. You will use SQL, Python, and BI tools to extract, manipulate, and analyze structured data. You will apply statistical, financial, and forecasting models to support business planning and performance optimization. You will partner with cross-functional teams to translate business questions into analytical frameworks. You will ensure data accuracy, integrity, and consistency across multiple data sources. 
You will support AI-driven and advanced analytics initiatives to improve customer insights. You will continuously improve reporting processes and data platforms to meet evolving business needs.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Analyze customer data to uncover trends, behaviors, and opportunities to enhance customer experience</li>\n<li>Develop dashboards and reports to support CX performance tracking and strategic initiatives</li>\n<li>Use SQL, Python, and BI tools to extract, manipulate, and analyze structured data</li>\n<li>Apply statistical, financial, and forecasting models to support business planning and performance optimization</li>\n<li>Partner with cross-functional teams to translate business questions into analytical frameworks</li>\n<li>Ensure data accuracy, integrity, and consistency across multiple data sources</li>\n<li>Support AI-driven and advanced analytics initiatives to improve customer insights</li>\n<li>Continuously improve reporting processes and data platforms to meet evolving business needs</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>5+ years of experience in Data Analytics, Business Intelligence, or Customer Analytics roles</li>\n<li>Bachelor’s degree in Statistics, Mathematics, Business Analytics, Data Engineering, Data Science, or related field</li>\n<li>Advanced English proficiency</li>\n<li>Analytics – Intermediate</li>\n<li>Statistical Modeling – Intermediate</li>\n<li>Financial Modeling – Intermediate</li>\n<li>Forecasting – Intermediate</li>\n<li>SQL – Intermediate</li>\n<li>Python – Intermediate</li>\n<li>Excel – Intermediate</li>\n<li>Power BI – Entry Level</li>\n<li>Snowflake – Entry Level</li>\n<li>Microsoft 365 &amp; Copilot – Entry Level</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_c818f438-851","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/vRDgqUEKbzbmw8fG9oDkZ8/hybrid-fbs-cx-data-analyst-in-bogot%C3%A1-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Python","BI tools","Statistical modeling","Financial modeling","Forecasting","Data analysis","Data visualization","Business intelligence","Customer analytics"],"x-skills-preferred":["Power BI","Snowflake","Microsoft 365 & Copilot"],"datePosted":"2026-03-09T16:51:55.800Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bogotá, Bogota, Colombia"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, Python, BI tools, Statistical modeling, Financial modeling, Forecasting, Data analysis, Data visualization, Business intelligence, Customer analytics, Power BI, Snowflake, Microsoft 365 & Copilot"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9d674bed-4cf"},"title":"FBS Senior Data Engineer : Finance Domain","description":"<p>We seek a Data Engineer with a strong P&amp;C insurance background to analyze business data stories and translate them into technical requirements. 
The ideal candidate can independently design, build, test, and implement data products of various complexity with minimal guidance.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design, build, test, and implement data products of various complexity</li>\n<li>Analyze business data stories and translate them into technical requirements</li>\n<li>Work with finance data, banks, or insurance sector</li>\n<li>Collaborate with cross-functional teams to deliver data-driven solutions</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Must have skills:</li>\n<li>DBT - Advanced</li>\n<li>Snowflake - Intermediate</li>\n<li>SQL - Advanced</li>\n<li>Experience in Financial Industry is must</li>\n<li>Agile methodology, Software development, data development</li>\n<li>Python/ PySpark - Intermediate</li>\n<li>Shell Scripting - Intermediate</li>\n<li>Power BI - Intermediate</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive compensation and benefits package:</li>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Career development and training opportunities</li>\n<li>Flexible work arrangements (remote and/or office-based)</li>\n<li>Dynamic and inclusive work culture within a globally renowned group</li>\n<li>Private Health Insurance</li>\n<li>Pension Plan</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development</li>\n</ul>\n<p><strong>About Capgemini</strong></p>\n<p>Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. 
The company has a strong 55-year heritage and deep industry expertise.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9d674bed-4cf","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/u9sZtNc5Dn5iXcrSyEU92H/hybrid-fbs-senior-data-engineer-%3A-finance-domain-in-hyderabad-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["DBT","Snowflake","SQL","Python","PySpark","Shell Scripting","Power BI"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:51:51.894Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Hyderabad, Telangana, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"DBT, Snowflake, SQL, Python, PySpark, Shell Scripting, Power BI"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_56dc9a51-e66"},"title":"Principal Consultant - Data Architecture","description":"<p><strong>Principal Consultant - Data Architecture</strong></p>\n<p>You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset.</p>\n<p><strong>About Your Role</strong></p>\n<p>As a Principal Data Architecture Consultant, you will act as a senior technical leader in complex data and analytics engagements. 
You will shape and govern end-to-end enterprise data architectures, lead technical teams, and serve as a trusted technical advisor for clients and internal stakeholders.</p>\n<p><strong>Your Role Will Include:</strong></p>\n<ul>\n<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>\n<li>Translate business objectives into scalable, secure, and compliant data solutions</li>\n<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>\n<li>Guide delivery teams through implementation, rollout, and production readiness</li>\n<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>\n<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>\n<li>Support pre-sales and solution design activities from a technical perspective</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>\n<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>\n<li>Strong client-facing experience in complex enterprise environments</li>\n</ul>\n<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>\n<ul>\n<li>Strong expertise in modern data architectures, including:</li>\n<li>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures</li>\n<li>Modern Data Architecture design principles</li>\n<li>Batch and streaming data integration patterns</li>\n<li>Data Platform, DevOps, deployment and security architectures</li>\n<li>Analytics and AI enablement architectures</li>\n<li>Hands-on experience with cloud data platforms, e.g.:</li>\n<li>Azure, AWS or GCP</li>\n<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>\n<li>Strong SQL skills and 
experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>\n<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>\n<li>Solid understanding of API-based and event-driven architectures</li>\n<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>\n</ul>\n<p><strong>Engineering &amp; Platform Foundations</strong></p>\n<ul>\n<li>Experience with data pipelines, orchestration, and automation</li>\n<li>Familiarity with CI/CD concepts and production-grade deployments</li>\n<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>\n</ul>\n<p><strong>Data Management &amp; Governance</strong></p>\n<ul>\n<li>Strong understanding of data management and governance principles, including:</li>\n<li>Data quality, metadata, lineage, master data management</li>\n<li>Data Management software and tools</li>\n<li>Security, access control, and compliance considerations</li>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>\n<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>\n<li>Hands-on Experience with data governance or metadata tools</li>\n<li>Cloud, data, or architecture certifications</li>\n</ul>\n<p><strong>Language &amp; Mobility</strong></p>\n<ul>\n<li>Very good English skills</li>\n<li>Willingness to travel for project-related work</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>You will be utilizing the most innovative technological solutions in modern data ecosystem. 
In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>\n<p><strong>About Infosys Consulting</strong></p>\n<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>\n<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>\n<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on the Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>\n<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you.... 
Apply today!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_56dc9a51-e66","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Infosys Consulting - Europe","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/hpBWjvvy8D6B1f818cHxZR/remote-principal-consultant---data-architecture-in-poland-at-infosys-consulting---europe","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["enterprise data architecture","system data integration","data engineering","analytics","modern data architectures","Data Mesh/ Data Fabric/ Data lake / data warehouse architectures","Modern Data Architecture design principles","Batch and streaming data integration patterns","Data Platform, DevOps, deployment and security architectures","Analytics and AI enablement architectures","cloud data platforms","Azure","AWS","GCP","Databricks","Snowflake","BigQuery","Azure Synapse / Microsoft Fabric","SQL","relational databases","Postgres","SQL Server","Oracle","NoSQL databases","Cosmos DB","MongoDB","InfluxDB","API-based and event-driven architectures","data migration programmes","data pipelines","orchestration","automation","CI/CD concepts","production-grade deployments","distributed systems","Docker","Kubernetes","data management and governance principles","data quality","metadata","lineage","master data management","data management software and tools","security","access control","compliance considerations","Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience"],"x-skills-preferred":["advanced analytics","AI / ML or GenAI","streaming platforms","Kafka","Azure Event Hubs","data governance or metadata tools","cloud","data","architecture 
certifications"],"datePosted":"2026-03-09T16:51:22.857Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Poland"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"enterprise data architecture, system data integration, data engineering, analytics, modern data architectures, Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, cloud data platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, SQL, relational databases, Postgres, SQL Server, Oracle, NoSQL databases, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, data migration programmes, data pipelines, orchestration, automation, CI/CD concepts, production-grade deployments, distributed systems, Docker, Kubernetes, data management and governance principles, data quality, metadata, lineage, master data management, data management software and tools, security, access control, compliance considerations, Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience, advanced analytics, AI / ML or GenAI, streaming platforms, Kafka, Azure Event Hubs, data governance or metadata tools, cloud, data, architecture certifications"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a4431994-e0e"},"title":"FBS Sr Data Engineer (Insurance Experience)","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. 
We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>We seek a Data Engineer with a strong P&amp;C insurance background to analyse business data stories and translate them into technical requirements. The ideal candidate can independently design, build, test, and implement data products of various complexity with minimal guidance.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Acquire, curate, and publish data for analytical or operational uses</li>\n<li>Ensure data is in a ready-to-use form that creates a single version of the truth across all data consumers</li>\n<li>Translate business analytic requests/requirements into design, development, testing, deployment, and production maintenance tasks</li>\n<li>Work independently with various technologies from big data, relational and non-relational databases, cloud environments, different programming languages, and various reporting tools</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>At least 2 years of insurance background – ideally P&amp;C, with flexibility for other areas (Health, Life) if upskilling is possible</li>\n<li>4-6 years of experience as a Data Engineer with ETL using SQL</li>\n<li>Advanced SQL skills</li>\n<li>Advanced DBT/ Informatica skills</li>\n<li>Intermediate Snowflake skills</li>\n<li>Intermediate Power BI skills</li>\n<li>Intermediate Python/ PySpark skills</li>\n<li>Intermediate Shell Scripting skills</li>\n<li>Strong communication and problem-solving skills</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<ul>\n<li>Competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Flexible work arrangements (remote and/or office-based)</li>\n<li>Private Health Insurance</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development opportunities in partnership 
with renowned companies</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_a4431994-e0e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/i4AFDZ5mxG6cP52mJ3GHJk/remote-fbs-sr-data-engineer-(insurance-experience)-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","DBT/ Informatica","Snowflake","Power BI","Python/ PySpark","Shell Scripting"],"x-skills-preferred":["Agile methodology","Software development","data development","Commercial / Business Insurance"],"datePosted":"2026-03-09T16:51:08.784Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, DBT/ Informatica, Snowflake, Power BI, Python/ PySpark, Shell Scripting, Agile methodology, Software development, data development, Commercial / Business Insurance"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b2fcfe0b-0dd"},"title":"FBS AWS Data Engineer","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. 
This position works on data projects of intermediate complexity to lead in the design, development, and implementation of data products.</p>\n<p>Key Responsibilities\n• Prep and cleanse data to optimize for downstream reporting via Farmers standard visualization or AI/ML tools with coaching and feedback\n• Translate business data stories into a technical story breakdown structure and work estimates for a schedule or planned agile sprint\n• Develop and maintain moderately complex scalable data pipelines for both streaming and batch requirements and build out new API integrations to support increased demands of data volume and complexity\n• Produce data building blocks, data models, and data flows for varying client requests such as dimensional data, standard and ad hoc reporting, data feeds, dashboard reporting, and data science research and exploration\n• Create business user access methods to structured and unstructured data. Utilize techniques such as mapping data to a common data model, natural language processing, transforming data as necessary to satisfy business rules, AI, statistical computations, and validation\n• Acquire, curate, and publish data both on prem and in the cloud for analytical or operational uses for basic to moderate scenarios\n• Ensure the data is in a ready-to-use form that creates a single version of the truth across all data consumers, including business/technology users, reporting and visualization specialists and data scientists with coaching and support\n• Translate business analytic requests/requirements into design, development, testing, deployment, and production maintenance tasks\n• Work with various technologies from big data, relational and non-relational databases, cloud environments, different programming languages, and various reporting tools, with familiarity in a few and training required for some</p>\n<p>Requirements\n• 4-6 years of experience in a similar role as a Data Engineer with 
AWS Tools\n• BS in Computer Science or similar\n• Full English Fluency\n• Exp Insurance within finance area (PLUS)</p>\n<p>Technical Experience\n• Python and SQL – Intermediate (MUST)\n• AWS tools such as AWS Glue, S3, AWS Lambda, Iceberg and Lake Formation (MUST)\n• Snowflake - Intermediate (4-6 Years) (MUST)\n• DBT - Entry Level (1-3 Years) (MUST)\n• AWS Cloud Data - Intermediate (4-6 Years) (MUST)\n• MSSQL - Entry Level (1-3 Years) (Desirable)\n• Communications - Intermediate\n• Office Suite - Intermediate\n• Rally - Entry Level or similar\n• Agile - Entry Level, knowledge</p>\n<p>Benefits\nThis position comes with a competitive compensation and benefits package.\n• A competitive salary and performance-based bonuses.\n• Comprehensive benefits package.\n• Flexible work arrangements (remote and/or office-based).\n• You will also enjoy a dynamic and inclusive work culture within a globally renowned group.\n• Private Health Insurance.\n• Paid Time Off.\n• Training &amp; Development opportunities in partnership with renowned companies.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b2fcfe0b-0dd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/nog4LBbHddk4ZFvf6Bfqdh/remote-fbs-aws-data-engineer-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","SQL","AWS Glue","S3","AWS Lambda","Iceberg","Lake Formation","Snowflake","DBT","AWS Cloud Data","MSSQL","Communications","Office 
Suite","Rally","Agile"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:50:42.993Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, SQL, AWS Glue, S3, AWS Lambda, Iceberg, Lake Formation, Snowflake, DBT, AWS Cloud Data, MSSQL, Communications, Office Suite, Rally, Agile"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_fbb19758-f83"},"title":"Principal Consultant Data Architecture (m/w/d)","description":"<p>Are you looking to advance your career and work with experienced, talented colleagues to successfully solve the most significant challenges of our clients? We are growing further and seeking engaged individuals to strengthen our team. You will be part of a dynamic, strongly growing company with over 300,000 employees.</p>\n<p>Our dynamic organisation allows you to work across themes and bring in your ideas, experiences, creativity, and goal orientation. Are you ready?</p>\n<p>As a Principal Consultant Data Architecture, you will be the technical leader in complex data and analytics projects. 
You will design and be responsible for comprehensive enterprise data architectures, lead technical teams, and be a trusted technical advisor for customers and internal stakeholders.</p>\n<p>You will ensure that enterprise data and analytics solutions are scalable, secure, and operational, translate requirements into robust technical blueprints, and plan their rollout.</p>\n<p><strong>Your Tasks:</strong></p>\n<ul>\n<li>Definition and governance of target architectures for enterprise data, integration, and analytics in cloud and hybrid environments</li>\n<li>Translation of business goals into scalable, secure, and compliant architectures</li>\n<li>Leading the design of comprehensive end-to-end data solutions (data ingestion, data integration, storage, security, processing, analytics, AI support)</li>\n<li>Steering and supporting delivery teams during implementation, rollout, and establishment of operational readiness</li>\n<li>Senior technical point of contact for architects, IT managers, and technical teams on the customer side</li>\n<li>Mentoring of system and data architects as well as developers</li>\n<li>Contributing to the further development of best practices and reference architectures</li>\n<li>Supporting presales and solution design activities from a technical perspective</li>\n</ul>\n<p><strong>What You Bring - Minimum Requirements</strong></p>\n<p><strong>Experience &amp; Seniority</strong></p>\n<ul>\n<li>At least 5 years of relevant professional experience in enterprise data architecture, data integration, data engineering, or analytics</li>\n<li>Experience in leading enterprise data architecture workstreams or technical teams</li>\n<li>Strong customer and advisory experience in complex enterprise environments</li>\n</ul>\n<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>\n<ul>\n<li>In-depth expertise in modern data architectures, particularly:</li>\n</ul>\n<ol>\n<li>Data Mesh / Data Fabric / Data Lake / Data Warehouse 
Architectures</li>\n<li>Principles of modern data architecture designs</li>\n<li>Integration patterns for batch and streaming data</li>\n<li>Data platform, DevOps, deployment, and security architectures</li>\n<li>Analytics and AI enablement architectures</li>\n</ol>\n<ul>\n<li>Practical experience with cloud data platforms, such as:</li>\n</ul>\n<ol>\n<li>Azure, AWS, or GCP</li>\n<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>\n</ol>\n<ul>\n<li>Very good SQL knowledge as well as experience with relational databases (e.g. PostgreSQL, SQL-Server, Oracle)</li>\n<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>\n<li>Good understanding of API-based and event-driven architectures</li>\n<li>Experience in conceiving and steering enterprise data migration programs (including mapping, transformation rules, data quality measures, etc.)</li>\n</ul>\n<p><strong>Engineering &amp; Platform Fundamentals</strong></p>\n<ul>\n<li>Experience with data pipelines, orchestration, and automation</li>\n<li>Knowledge of CI/CD concepts and production-ready deployments</li>\n<li>Understanding of distributed systems; Docker / Kubernetes knowledge is an advantage</li>\n</ul>\n<p><strong>Data Management &amp; Governance</strong></p>\n<ul>\n<li>Very good understanding of data management and governance principles, particularly:</li>\n</ul>\n<ol>\n<li>Data quality, metadata, lineage, master data management</li>\n<li>Data management software and tools</li>\n<li>Security, access, and compliance requirements</li>\n</ol>\n<ul>\n<li>Bachelor&#39;s or master&#39;s degree in computer science, engineering, mathematics, or a related field, or equivalent practical experience</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Experience with advanced analytics, AI/ML, or GenAI from an architect&#39;s perspective</li>\n<li>Experience with streaming platforms (e.g. 
Kafka, Azure Event Hubs)</li>\n<li>Practical experience with data governance or metadata tools</li>\n<li>Cloud or architecture certifications</li>\n</ul>\n<p><strong>Language &amp; Mobility (Germany)</strong></p>\n<ul>\n<li>Fluent German skills (at least C1) for customer communication in the German-speaking market</li>\n<li>Very good English skills</li>\n<li>Willingness to travel for projects</li>\n</ul>\n<p><strong>About Your Team</strong></p>\n<p>You will become part of our growing data and analytics teams. In this area, you will work with modern technologies in state-of-the-art data ecosystems. You have the opportunity to turn your own ideas into results - in the areas of data and analytics strategy, data management and governance, data platforms and engineering, as well as analytics and data science.</p>\n<p><strong>About Infosys Consulting</strong></p>\n<p>You will become an employee of a globally renowned management consulting firm that is at the forefront of industry disruption. We work across industries with leading companies. Our culture is inclusive and entrepreneurial. As a mid-sized consulting firm backed by the scale of Infosys, we can support our customers worldwide, as a true partner, throughout the entire transformation process.</p>\n<p>Our values IC-LIFE - Inclusion, Equity &amp; Diversity, Client, Leadership, Integrity, Fairness, and Excellence - form our value compass. Further information can be found on our career website.</p>\n<p>In Europe, we have been recognized by the Financial Times and Forbes as one of the leading consulting firms. Infosys is one of the top employers in Germany in 2023 and has been certified by the Top Employers Institute for outstanding working conditions in Europe for five years in a row.</p>\n<p>We offer market-leading remuneration, attractive additional benefits, and excellent further education and development opportunities. Have we sparked your interest? 
Then we look forward to your application</p>\n<p>More about Infosys Consulting - Europe</p>\n<p><strong>Visit website</strong></p>\n<p>Where Innovation meets Excellence.</p>\n<p>Infosys Consulting is a globally renowned management consulting firm that is on the front-line of industry disruption. We are a mid-size player with a supportive, entrepreneurial spirit that works with a market-leading brand in every sector, while our parent organization Infosys is a top-5 powerhouse IT brand that is outperforming the market and experiencing rapid growth.</p>\n<p>Our consulting business is annually recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths we offer to our consultants. We are committed to fostering an inclusive work culture that inspires everyone to deliver their best.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_fbb19758-f83","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Infosys Consulting - Europe","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/sve4gTuNFLf3RtEjhQMzHp/remote-principal-consultant-data-architecture-(m%2Fw%2Fd)--deutschlandweit-in-munich-at-infosys-consulting---europe","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Data Mesh","Data Fabric","Data Lake","Data Warehouse Architectures","Principles of modern data architecture designs","Integration patterns for batch and streaming data","Data platform, DevOps, deployment, and security architectures","Analytics and AI enablement architectures","Azure","AWS","GCP","Databricks","Snowflake","BigQuery","Azure Synapse / Microsoft Fabric","PostgreSQL","SQL-Server","Oracle","Cosmos 
DB","MongoDB","InfluxDB","API-based and event-driven architectures","Enterprise data migration programs"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:50:38.864Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Data Mesh, Data Fabric, Data Lake, Data Warehouse Architectures, Principles of modern data architecture designs, Integration patterns for batch and streaming data, Data platform, DevOps, deployment, and security architectures, Analytics and AI enablement architectures, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, PostgreSQL, SQL-Server, Oracle, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, Enterprise data migration programs"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_89cf8655-bba"},"title":"Sr Business Solutions Analyst - Data Product Owner","description":"<p>Develop complex value-based analytics solutions for quality, business process functionality, or other strategic/data/analytic initiatives that are complex in nature.</p>\n<p>Collaborate with cross-functional teams to identify and establish analytics requirements.</p>\n<p>Create detailed documentation and business plans that address stakeholder needs.</p>\n<p>Liaise between IT and Product Management for data solutions.</p>\n<p>Trusted business partner working with business leaders and cross-functional teams to identify key challenges and work on resolutions with the goal of improving organisational performance.</p>\n<p>Use advanced understanding of data elements, structures, standards typically involved in financial services transactions, reporting, and cloud-based data storage/sharing to adopt and apply standards in a way that satisfies and balances regulatory and compliance requirements with business needs.</p>\n<p>Develop and implement complex enterprise-wide data 
integration strategies that utilise various data-marts and data warehouse applications to enhance reporting.</p>\n<p>Continuously work to improve data platforms, dashboards, and reports to meet emerging and changing business demands and guidelines.</p>\n<p>Consult with minimal guidance from Principal on UAT, technical specifications, strategic planning, contract design, technical support, testing updates, and trainings.</p>\n<p>Manage data from multiple sources and use project management skills to complete complex projects with minimal coaching and guidance.</p>\n<p>Transmit data and proactively work to ensure data quality.</p>\n<p>Troubleshoot data issues and devise creative and effective ways to avoid or mitigate issues.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_89cf8655-bba","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/n5t7HfGHJjo9guNgu39F28/hybrid-sr-business-solutions-analyst---data-product-owner-in-pune-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Data Warehousing","Data Architecture & Design","Auto & Home Insurance","SQL","Snowflake","Informatica","Power BI"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:49:09.849Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Pune, Maharashtra, India"}},"employmentType":"FULL_TIME","occupationalCategory":"IT","industry":"Technology","skills":"Data Warehousing, Data Architecture & Design, Auto & Home Insurance, SQL, Snowflake, Informatica, Power 
BI"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_93e8d013-2c1"},"title":"Sr Business Solutions Analyst - Data Product Owner","description":"<p>Develop complex value-based analytics solutions for quality, business process functionality, or other strategic/data/analytic initiatives. Collaborate with cross-functional teams to identify and establish analytics requirements. Create detailed documentation and business plans that address stakeholder needs. Liaise between IT and Product Management for data solutions. Work with business leaders and cross-functional teams to identify key challenges and work on resolutions to improve organisational performance. Use advanced understanding of data elements, structures, standards typically involved in financial services transactions, reporting, and cloud-based data storage/sharing to adopt and apply standards in a way that satisfies and balances regulatory and compliance requirements with business needs. Develop and implement complex enterprise-wide data integration strategies that utilise various data-marts and data warehouse applications to enhance reporting. Continuously work to improve data platforms, dashboards, and reports to meet emerging and changing business demands and guidelines. Consult with minimal guidance on UAT, technical specifications, strategic planning, contract design, technical support, testing updates, and trainings. Manage data from multiple sources and use project management skills to complete complex projects with minimal coaching and guidance. Transmit data and proactively work to ensure data quality. 
Troubleshoot data issues and devise creative and effective ways to avoid or mitigate issues.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Developing complex value-based analytics solutions</li>\n<li>Collaborating with cross-functional teams to identify and establish analytics requirements</li>\n<li>Creating detailed documentation and business plans</li>\n<li>Liaising between IT and Product Management for data solutions</li>\n<li>Working with business leaders and cross-functional teams to identify key challenges and work on resolutions to improve organisational performance</li>\n<li>Developing and implementing complex enterprise-wide data integration strategies</li>\n<li>Continuously working to improve data platforms, dashboards, and reports</li>\n<li>Consulting with minimal guidance on various technical and business aspects</li>\n<li>Managing data from multiple sources and using project management skills to complete complex projects</li>\n<li>Troubleshooting data issues and devising creative and effective ways to avoid or mitigate issues</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_93e8d013-2c1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/iUXtT6bRSaetL2aU9Hq8RU/hybrid-sr-business-solutions-analyst---data-product-owner-in-hyderabad-at-capgemini","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Data Warehousing","Data Architecture & Design","Auto & Home Insurance","SQL","Snowflake","Informatica","Power 
BI"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:45:53.397Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Hyderabad"}},"employmentType":"FULL_TIME","occupationalCategory":"IT","industry":"Technology","skills":"Data Warehousing, Data Architecture & Design, Auto & Home Insurance, SQL, Snowflake, Informatica, Power BI"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d92bf714-0ed"},"title":"Python Financial Model Engineer, Associate","description":"<p>About this role</p>\n<p>We are looking for a self-motivated software engineer to onboard models to our platform. Collaborate with modellers on the implementation and deployment of financial risk models. Understand client requirements and translate them into software engineering tasks.</p>\n<p>Our team</p>\n<ul>\n<li>Is passionate about technology and solving complex problems.</li>\n<li>Develops in Python, working with technologies like Pandas, Apache Arrow, Snowflake, Prefect, Docker, and Azure DevOps.</li>\n<li>Consists of technologists that unlock constant innovation.</li>\n<li>Constantly challenges the technology status quo and looks for ways to improve the platform.</li>\n</ul>\n<p>Key Responsibilities</p>\n<p>We expect the role to involve the following core responsibilities and would expect a successful candidate to be able to demonstrate skills or experience with the following (not in order of priority):</p>\n<ul>\n<li>Quickly learn the platform and act as a subject matter expert towards modelling teams and product analysts.</li>\n<li>Work with modellers and product analysts to understand the business and their requirements. 
Help implement those on our platform using engineering best practices.</li>\n<li>Facilitate technical design and code review sessions to ensure software meets functional and compatibility requirements, as well as high quality standards.</li>\n<li>Stay abreast of the latest developments in machine learning, quantitative finance, and technology to incorporate innovative solutions into our platform.</li>\n<li>Enhance the performance of existing models, ensuring they operate efficiently at scale.</li>\n<li>Implementation and maintenance of a standard data / technology deployment workflow to ensure that all deliverables/enhancements are delivered in a disciplined and robust manner.</li>\n<li>Ensure operational readiness of the product and meet customer commitments with regards to incident SLAs.</li>\n</ul>\n<p>Skillset</p>\n<ul>\n<li>Strong experience (3+ years) in Python is crucial</li>\n<li>Bachelor’s (BSc) or higher degree in Computer Science or related field</li>\n<li>Experience with Pandas, Apache Arrow, Snowflake, (Prefect is a plus)</li>\n<li>Good understanding of Object-Oriented Design principles</li>\n<li>Fluency with AI coding tools and the use of LLM in everyday development</li>\n<li>Good understanding of fundamental Algorithms and Data Structures</li>\n<li>Knowledge of Azure DevOps and git, CI/CD</li>\n<li>Good understanding of unit tests, integration and regression tests, and their importance</li>\n<li>An aptitude for designing data models and pipelines is a plus</li>\n<li>Ability to understand advanced mathematical and statistical methods and concepts</li>\n<li>Fluency in reading, writing and speaking English</li>\n</ul>\n<p>Personal Qualities</p>\n<ul>\n<li>Team player</li>\n<li>Problem-solving skills</li>\n<li>Critical and analytical thinking</li>\n<li>Technical curiosity</li>\n<li>Adaptable</li>\n</ul>\n<p>Our benefits</p>\n<p>To help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment 
and tools designed to help you in building a sound financial future; access to education reimbursement; comprehensive resources to support your physical health and emotional well-being; family support programs; and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</p>\n<p>Our hybrid work model</p>\n<p>BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d92bf714-0ed","directApply":true,"hiringOrganization":{"@type":"Organization","name":"BlackRock","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/fj1ayC3FJpaeEtw176Tee2/python-financial-model-engineer%2C-associate-in-budapest-at-blackrock","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","Pandas","Apache Arrow","Snowflake","Prefect","Docker","Azure DevOps","Object-Oriented Design principles","AI coding tools","LLM","fundamental Algorithms and Data Structures","Azure DevOps and git","CI/CD","unit tests","integration and regression tests"],"x-skills-preferred":["advanced mathematical and statistical methods 
and concepts"],"datePosted":"2026-03-09T16:45:41.136Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Budapest, Hungary"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Python, Pandas, Apache Arrow, Snowflake, Prefect, Docker, Azure DevOps, Object-Oriented Design principles, AI coding tools, LLM, fundamental Algorithms and Data Structures, Azure DevOps and git, CI/CD, unit tests, integration and regression tests, advanced mathematical and statistical methods and concepts"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5093878d-5f9"},"title":"C# Application Engineer, Associate","description":"<p>At BlackRock, technology is central to our mission, and our team continues to drive innovation across the industry. We value curiosity, collaboration, and a willingness to experiment to tackle complex problems. As an Associate, Software Engineer on the Enterprise Data Solutions data team, you&#39;ll join an engineering pod responsible for third-party integrations, data feeds, APIs, and incorporating Preqin data into the BlackRock Enterprise Data Platform. Your work will focus on building robust, scalable systems that align with business objectives. You&#39;ll deliver high-quality solutions by applying strong data expertise, product insight, and effective communication skills. 
Collaboration with other engineers, product managers, and data owners will be key as you help design, develop, and launch new features and influence our technical strategy.</p>\n<p>Key responsibilities will include:</p>\n<ul>\n<li>Designing, implementing, and maintaining robust systems for data distribution to customers and third-party integrations.</li>\n<li>Collaborating closely with engineering teams across the organisation to ensure adoption of optimal technical solutions and raising development standards through knowledge sharing and best practice implementation.</li>\n<li>Actively contributing to technical discussions regarding new product directions, data modelling, and architectural decisions to ensure the technology platform remains scalable and adaptable.</li>\n</ul>\n<p>What we are looking for:</p>\n<ul>\n<li>4+ years’ experience in software engineering.</li>\n<li>Strong technical ability across the full stack: C#, Python, FastAPI, React and Typescript are a plus.</li>\n<li>Experience with PostgreSQL, MongoDB and other SQL and NoSQL databases (AWS Aurora, Azure Cosmos DB, MS SQL Server, Cassandra are a plus).</li>\n<li>Experience with Data Warehouse systems, particularly Snowflake (DataBricks is a plus).</li>\n<li>Experience of working within cloud provider services – Azure or AWS and utilization of infrastructure as code (Terraform).</li>\n<li>Familiarity with containerisation – Docker and Kubernetes.</li>\n<li>Excel plugin development experience is a plus.</li>\n<li>Excellent verbal and written communication and interpersonal skills.</li>\n<li>A “let’s do it” and “challenge accepted” attitude when faced with challenging tasks, and willingness to learn new technologies and ways of working.</li>\n</ul>\n<p>Our benefits include retirement investment, education reimbursement, comprehensive resources to support physical health and emotional well-being, family support programs, and Flexible Time Off (FTO) so you can relax, recharge and be there for the 
people you care about.</p>\n<p>Our hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5093878d-5f9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"BlackRock","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/eqw6vaVybvYbGaFNw8hUeY/c%23%2C-application-engineer%2C-associate-in-london-at-blackrock","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["C#","Python","FastAPI","React","Typescript","PostgreSQL","MongoDB","AWS Aurora","Azure Cosmos DB","MS SQL Server","Cassandra","Snowflake","DataBricks","Terraform","Docker","Kubernetes","Excel plugin development"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:45:21.927Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"C#, Python, FastAPI, React, Typescript, PostgreSQL, MongoDB, AWS Aurora, Azure Cosmos DB, MS SQL Server, Cassandra, Snowflake, DataBricks, Terraform, Docker, Kubernetes, Excel plugin 
development"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_fb3aa416-fd0"},"title":"Software Development Lead - .NET / Snowflake","description":"<p>About this role</p>\n<p>In the Aladdin Product Engineering, Private Markets department, we are seeking a Development Lead to lead our Software Engineering team based in London and Belgrade. The team works on evolving our Private Markets Reporting Solutions, focusing specifically on the reporting data layer.</p>\n<p>Fully integrated into our Aladdin Engineering Team, you will be exposed to both the technical and functional layers of our most innovative products, while building outstanding expertise in the fast-growing Alternative Investments space. You will be part of an international and diverse environment, with a strong drive for technical innovation.</p>\n<p>About you:</p>\n<p>You have a strong understanding of data, distributed systems, and the Software Development Lifecycle in an agile organization. As a Development Manager, your role will involve mentoring and guiding junior talent to deliver impactful outcomes for the business. 
You will also be hands-on: as a Senior Developer, your role consists of analysing, refining, implementing, and validating new features as well as maintaining and supporting the existing ecosystem of Reporting tools, including our data warehouse.</p>\n<p>You will collaborate with other development and platform teams, business partners, and QA team members in delivering high-quality software.</p>\n<p>Being a member of Aladdin Engineering, you will be:</p>\n<ul>\n<li><p>Curious and eager to learn new things, with a healthy disrespect for the status quo.</p>\n</li>\n<li><p>Willing to embrace work outside of your comfort zone, and open to mentorship from others; you make mistakes but learn from them.</p>\n</li>\n<li><p>Passionate about technology, with personal ownership for the work you do.</p>\n</li>\n<li><p>Data-focused, with an eye for the details that matter to solve the problem.</p>\n</li>\n</ul>\n<p>What will you be doing?</p>\n<p>You are leading the team and building new features, from their conception up to their deployment in production. You handle aspects of a SaaS product, including production monitoring and incident resolution on the cloud platform. You are also contributing to the improvement of the team methodologies: continuous integration/continuous delivery, automated testing, and the definition of standard processes. As an active member of the Alternative Engineering team, you are collaborating with different groups, full of hardworking, forward-thinking people with an outstanding innovation spirit.</p>\n<p>You have:</p>\n<ul>\n<li><p>Bachelor&#39;s or Master&#39;s degree in Computer Science, Mathematics, Engineering, or a related software engineering background</p>\n</li>\n<li><p>Experience in team and people management in an engineering environment</p>\n</li>\n<li><p>Deep expertise in MS SQL Server, including stored procedures, performance tuning, and data modelling</p>\n</li>\n<li><p>Familiarity with Snowflake and data pipeline concepts (ETL, batch vs. 
streaming)</p>\n</li>\n<li><p>Experience with C# and .NET (.NET Framework, .NET Core)</p>\n</li>\n<li><p>Experience with cloud-based services (AWS/Azure/Google Cloud)</p>\n</li>\n<li><p>Strong analytical and problem-solving skills; proactive approach with the ability to balance multiple projects simultaneously</p>\n</li>\n<li><p>Curiosity about the functional part of the product; basic knowledge of the Finance industry will be highly appreciated</p>\n</li>\n<li><p>Proficient English, both written and spoken</p>\n</li>\n</ul>\n<p>Our benefits</p>\n<p>To help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment and tools designed to help you in building a sound financial future; access to education reimbursement; comprehensive resources to support your physical health and emotional well-being; family support programs; and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</p>\n<p>Our hybrid work model</p>\n<p>BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.</p>\n<p>About BlackRock</p>\n<p>At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. 
Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_fb3aa416-fd0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"BlackRock","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/3i7CJ3MRvB6Z7rJrPgwKri/software-development-lead---.net-%2F-snowflake-in-london-at-blackrock","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["MS SQL Server","Snowflake","C#","Net Framework","Net Core","Cloud based services","AWS","Azure","Google Cloud","Software Development Lifecycle","Agile organization"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:44:05.054Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"MS SQL Server, Snowflake, C#, Net Framework, Net Core, Cloud based services, AWS, Azure, Google Cloud, Software Development Lifecycle, Agile organization"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_591c2b45-829"},"title":"Data Ops, Vice President","description":"<p><strong>About this role</strong></p>\n<p>We are looking for an innovative hands-on technologist to run Data Operations for one of the largest global FinTech&#39;s. 
This is a new role that will transform how we manage and process high-quality data at scale and reflects our commitment to invest in an Enterprise Data Platform to unlock our data strategy for BlackRock and our Aladdin Client Community.</p>\n<p><strong>Key responsibilities</strong></p>\n<p>The ideal candidate will be a high-energy, technology- and data-driven individual who has a track record of leading and running day-to-day operations.</p>\n<ul>\n<li>Ensure on-time, high-quality data delivery with a single pane of glass for data pipeline observability and support</li>\n<li>Live and breathe best practices of data ops such as culture, processes and technology</li>\n<li>Partner cross-functionally to enhance existing data sets, eliminating manual inputs and ensuring high quality, and onboarding new data sets</li>\n<li>Lead change while ensuring daily operational excellence, quality, and control</li>\n<li>Build and maintain deep alignment with key internal partners on ops tooling and engineering</li>\n<li>Foster an agile collaborative culture which is creative, open, supportive, and dynamic</li>\n</ul>\n<p><strong>Knowledge and Experience</strong></p>\n<ul>\n<li>8+ years’ experience in hands-on data operations including data pipeline monitoring and engineering</li>\n<li>Technical expertise, including experience with data processing, orchestration (Airflow), data ingestion, cloud-based databases/warehousing (Snowflake), and business intelligence tools</li>\n<li>Champion the adoption of AI-enabled capabilities across Data Operations, including anomaly detection, predictive issue management, and automated operational insights to improve resilience, scale, and client experience.</li>\n<li>The ability to operate and monitor large data sets through the data lifecycle, including the tooling and observability required to ensure data quality and control at scale</li>\n<li>Experience implementing, monitoring, and operating data pipelines that are fast, scalable, reliable, and 
accurate</li>\n<li>Understanding of modern-day data highways, the associated challenges, and effective controls</li>\n<li>Passionate about data platforms, data quality and everything data</li>\n<li>Practical, detail-oriented operations leader</li>\n<li>Inquisitive leader who will bring new ideas that challenge the status quo</li>\n<li>Ability to navigate a large, highly matrixed organization</li>\n<li>Strong presence with clients</li>\n</ul>\n<p><strong>Our benefits</strong></p>\n<p>To help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment and tools designed to help you in building a sound financial future; access to education reimbursement; comprehensive resources to support your physical health and emotional well-being; family support programs; and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</p>\n<p><strong>Our hybrid work model</strong></p>\n<p>BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. 
As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_591c2b45-829","directApply":true,"hiringOrganization":{"@type":"Organization","name":"BlackRock","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/szVvonyUEuyEFVfTxyXWtS/data-ops%2C-vice-president-in-edinburgh-at-blackrock","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data operations","data pipeline monitoring","data ingestion","cloud-based databases","business intelligence tools","Airflow","Snowflake"],"x-skills-preferred":["AI-enabled capabilities","anomaly detection","predictive issue management","automated operational insights"],"datePosted":"2026-03-09T16:43:27.322Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Edinburgh, Scotland"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"data operations, data pipeline monitoring, data ingestion, cloud-based databases, business intelligence tools, Airflow, Snowflake, AI-enabled capabilities, anomaly detection, predictive issue management, automated operational insights"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ec8eeead-726"},"title":"Java Engineer, Aladdin Engineering, Associate","description":"<p><strong>About this role</strong></p>\n<p>At BlackRock, technology is the foundation of our business. As a Java Back-End Engineer, you&#39;ll lead by example — architecting, coding, and mentoring teams to build resilient systems that power our global post-trade operations. 
You&#39;ll design and deliver enterprise-scale software with a focus on reliability, performance, and clean engineering practices.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Design and develop robust, high-performance back-end systems using Java 11+ and the Spring Boot ecosystem.</li>\n<li>Lead design discussions, code reviews, and architecture sessions with a hands-on approach.</li>\n<li>Build and maintain microservices and event-driven systems to process and distribute large-scale financial data.</li>\n<li>Develop data integration and pipeline components that connect systems across Snowflake, SQL Server, and real-time streaming platforms.</li>\n<li>Implement and optimize Redis-based caching and data stores for low-latency access patterns.</li>\n<li>Champion best practices for code quality, testing, automation, and performance tuning.</li>\n<li>Collaborate cross-functionally to ensure technical solutions align with product goals and business outcomes.</li>\n</ul>\n<p><strong>Qualifications / Competencies</strong></p>\n<ul>\n<li>B.S./M.S. in Computer Science, Engineering, or related discipline.</li>\n<li>3+ years of professional experience in Java and object-oriented design.</li>\n<li>Strong knowledge of Spring Boot, REST APIs, and enterprise integration patterns.</li>\n<li>Deep expertise in SQL Server, including stored procedures, performance tuning, and data modeling.</li>\n<li>Experience with Redis for caching or data persistence.</li>\n<li>Hands-on exposure to Kafka or similar publish-subscribe systems for real-time event processing.</li>\n<li>Familiarity with Snowflake and data pipeline concepts (ETL, batch vs. 
streaming).</li>\n<li>Experience with Agile coding practices and a general understanding of how LLMs work.</li>\n<li>Strong focus on clean architecture, maintainability, and production readiness.</li>\n</ul>\n<p><strong>Our benefits</strong></p>\n<p>To help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment and tools designed to help you in building a sound financial future; access to education reimbursement; comprehensive resources to support your physical health and emotional well-being; family support programs; and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</p>\n<p><strong>Our hybrid work model</strong></p>\n<p>BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. 
As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ec8eeead-726","directApply":true,"hiringOrganization":{"@type":"Organization","name":"BlackRock","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/3vLTpfkn1mYzZFEn6qtubs/java-engineer%2C-aladdin-engineering%2C-associate-in-edinburgh-at-blackrock","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Java","Spring Boot","SQL Server","Redis","Kafka","Snowflake","Agile coding"],"x-skills-preferred":["Kubernetes","Docker","cloud-native environments","observability tools","scripting experience in Python"],"datePosted":"2026-03-09T16:43:22.343Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Edinburgh, Scotland"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Java, Spring Boot, SQL Server, Redis, Kafka, Snowflake, Agile coding, Kubernetes, Docker, cloud-native environments, observability tools, scripting experience in Python"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_10d04dca-482"},"title":"Python Financial Engineering Platform Lead, Vice President","description":"<p>About this role</p>\n<p>BlackRock is seeking a self-motivated lead software engineer to spearhead the development of its Python financial engineering platform. 
The platform enables research, implementation, delivery, and execution of financial risk models for internal partners and external clients.</p>\n<p>What will you be doing?</p>\n<ul>\n<li>Lead platform development supporting both financial engineers and researchers.</li>\n<li>Facilitate technical design and code review sessions to ensure software meets functional and compatibility requirements, as well as high quality standards.</li>\n<li>Build widely used and reliable fundamental components as part of the platform, distributed as Python libraries.</li>\n<li>Stay abreast of the latest developments in machine learning, quantitative finance, and technology to incorporate innovative solutions into our platform.</li>\n</ul>\n<p>Key Responsibilities</p>\n<ul>\n<li>Quickly learn the platform and act as a subject matter expert for modelling teams.</li>\n<li>Build high-quality software that improves the user experience of the downstream modeller and developer.</li>\n<li>Enhance the performance of existing models, ensuring they operate efficiently at scale.</li>\n<li>Implement and maintain a standard data/technology deployment workflow to ensure that all deliverables/enhancements are delivered in a disciplined and robust manner.</li>\n</ul>\n<p>Skillset</p>\n<ul>\n<li>Strong experience (5+ years) in Python is crucial.</li>\n<li>Bachelor&#39;s (BSc) or higher degree in Computer Science or equivalent field.</li>\n<li>Experience with Pandas, Apache Arrow, and Snowflake (Prefect is a plus).</li>\n<li>Good understanding of Object-Oriented Design principles.</li>\n<li>Good understanding of fundamental Algorithms and Data Structures.</li>\n<li>Knowledge of Azure DevOps, git, and CI/CD.</li>\n<li>Good understanding of unit tests, integration and regression tests, and their importance.</li>\n</ul>\n<p>Personal Qualities</p>\n<ul>\n<li>Team player.</li>\n<li>Problem-solving skills.</li>\n<li>Critical and analytical thinking.</li>\n<li>Technical 
curiosity.</li>\n<li>Adaptable.</li>\n</ul>\n<p>Our benefits</p>\n<ul>\n<li>Retirement investment and tools designed to help you in building a sound financial future.</li>\n<li>Access to education reimbursement.</li>\n<li>Comprehensive resources to support your physical health and emotional well-being.</li>\n<li>Family support programs.</li>\n<li>Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</li>\n</ul>\n<p>Our hybrid work model</p>\n<p>BlackRock&#39;s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_10d04dca-482","directApply":true,"hiringOrganization":{"@type":"Organization","name":"BlackRock","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/2mb7rZ3FNbXbRf6dpxMmaf/python-financial-engineering-platform-lead%2C-vice-president-in-budapest-at-blackrock","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","Pandas","Apache Arrow","Snowflake","Azure DevOps","git","CI/CD","unit tests","integration and regression tests"],"x-skills-preferred":["Prefect"],"datePosted":"2026-03-09T16:42:39.478Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Budapest, Hungary"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Python, Pandas, Apache Arrow, Snowflake, Azure DevOps, git, CI/CD, unit tests, integration and regression tests, Prefect"}]}