{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/cloud-data-platforms"},"x-facet":{"type":"skill","slug":"cloud-data-platforms","display":"Cloud Data Platforms","count":10},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b68ff4cc-e74"},"title":"Data Engineer, Safeguards","description":"<p><strong>About the role</strong></p>\n<p>Anthropic is looking for a Data Engineer to join the Safeguards team and build the data foundations that keep our AI systems safe. The Safeguards team works to monitor models, prevent misuse, and ensure user well-being.</p>\n<p>You&#39;ll design and build the data pipelines, warehousing solutions, and analytical tooling that power our safety and trust efforts at scale. 
You&#39;ll work closely with engineers, data scientists, and policy teams to ensure the Safeguards organization has the data it needs to detect abuse patterns, measure the effectiveness of safety interventions, and make informed decisions about model behavior and enforcement.</p>\n<p>This is a high-impact role where your work will directly support Anthropic&#39;s mission to develop AI that is safe and beneficial.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Design, build, and maintain scalable data pipelines that support safety monitoring, abuse detection, and enforcement workflows</li>\n<li>Develop and optimize data models and warehousing solutions to enable efficient analysis of large-scale usage and safety data</li>\n<li>Build and maintain dashboards and reporting infrastructure that give Safeguards teams visibility into model behavior, misuse patterns, and enforcement outcomes</li>\n<li>Collaborate with engineers to integrate data from multiple sources, including model outputs, user reports, and automated classifiers, into a unified analytical layer</li>\n<li>Implement data quality frameworks, monitoring, and alerting to ensure the reliability of safety-critical data</li>\n<li>Partner with research teams to surface data insights that inform model improvements and safety interventions</li>\n<li>Develop self-service data tooling that enables stakeholders to explore safety data and generate reports independently</li>\n<li>Contribute to data governance practices, including access controls, retention policies, and privacy-compliant data handling</li>\n</ul>\n<p><strong>You may be a good fit if you:</strong></p>\n<ul>\n<li>Have 3+ years of experience in data engineering, analytics engineering, or a related role</li>\n<li>Are proficient in SQL and Python, with experience building and maintaining ETL/ELT pipelines</li>\n<li>Have hands-on experience with modern data stack tools such as dbt, Airflow, Spark, or similar orchestration and transformation 
frameworks</li>\n<li>Have worked with cloud data platforms (BigQuery, Redshift, Snowflake, or similar)</li>\n<li>Are comfortable building dashboards and data visualizations using tools like Looker, Tableau, or Metabase</li>\n<li>Communicate clearly and can translate complex data concepts for both technical and non-technical audiences</li>\n<li>Are results-oriented, flexible, and willing to pick up slack even when it falls outside your job description</li>\n<li>Care about the societal impacts of AI and are motivated by safety work</li>\n</ul>\n<p><strong>Strong candidates may have:</strong></p>\n<ul>\n<li>Experience with trust &amp; safety, integrity, fraud, or abuse detection data systems</li>\n<li>Experience with large-scale event streaming systems (Kafka, Pub/Sub, Kinesis)</li>\n<li>Built data infrastructure that supports ML model monitoring or evaluation</li>\n<li>A background in statistical analysis, or experience collaborating closely with data scientists</li>\n<li>Developed internal tooling or self-service analytics platforms</li>\n</ul>\n<p><strong>Strong candidates need not have:</strong></p>\n<ul>\n<li>A formal degree in Computer Science or a related field; we value practical experience and demonstrated ability over credentials</li>\n<li>Prior experience in AI or machine learning; you&#39;ll learn the domain-specific context on the job</li>\n<li>Previous experience at an AI safety or research organization</li>\n<li>Deep expertise across every tool listed above; familiarity with a subset and a willingness to learn are enough</li>\n</ul>\n<p><strong>Logistics</strong></p>\n<ul>\n<li>Minimum education: Bachelor’s degree or an equivalent combination of education, training, and/or experience</li>\n<li>Required field of study: A field relevant to the role as demonstrated through coursework, training, or professional experience</li>\n<li>Minimum years of experience: Years of experience required will correlate with the internal job level requirements for the position</li>\n<li>Location-based hybrid policy: Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</li>\n<li>Visa sponsorship: We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</li>\n</ul>\n<p><strong>How we&#39;re different</strong></p>\n<p>We believe that the highest-impact AI research will be big science. At Anthropic we work as a single cohesive team on just a few large-scale research efforts. And we value impact, advancing our long-term goals of steerable, trustworthy AI, rather than working on smaller and more specific puzzles. We view AI research as an empirical science, which has as much in common with physics and biology as with traditional efforts in computer science. We&#39;re an extremely collaborative group, and we host frequent research discussions to ensure that we are pursuing the highest-impact work at any given time. As such, we greatly value communication skills. The easiest way to understand our research directions is to read our recent research. This research continues many of the directions our team worked on prior to Anthropic, including: GPT-3, Circuit-Based Interpretability, Multimodal Neurons, Scaling Laws, AI &amp; Compute, Concrete Problems in AI Safety, and Learning from Human Preferences.</p>\n<p><strong>Come work with us!</strong></p>\n<p>Anthropic is a public benefit corporation headquartered in San Francisco. 
We offer competitive compensation and benefits, optional equity donation matching, generous vacation and parental leave, flexible working hours, and a lovely office space in which to collaborate with colleagues.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b68ff4cc-e74","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.com/","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5156057008","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"£170,000-£220,000 GBP","x-skills-required":["SQL","Python","ETL/ELT pipelines","dbt","Airflow","Spark","cloud data platforms","BigQuery","Redshift","Snowflake","Looker","Tableau","Metabase"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:59:33.960Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, UK"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, ETL/ELT pipelines, dbt, Airflow, Spark, cloud data platforms, BigQuery, Redshift, Snowflake, Looker, Tableau, Metabase","baseSalary":{"@type":"MonetaryAmount","currency":"GBP","value":{"@type":"QuantitativeValue","minValue":170000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_47807ca3-e36"},"title":"Strategic AI/BI Account Executive","description":"<p>We are seeking a Strategic AI/BI Account Executive to help enterprise customers transform how business users interact with data. 
This high-impact role sits within the AI Go-To-Market team and partners closely with Enterprise Account Executives to drive adoption of Databricks AI/BI and Genie in APJ.</p>\n<p>You will help organisations move beyond static dashboards to governed, conversational, AI-powered analytics at the centre of the convergence of business intelligence, data platforms, and generative AI. Enterprise analytics is rapidly evolving from dashboards and static reporting to conversational, AI-driven decision platforms. Databricks AI/BI and Genie empower business users to securely interact with governed data using natural language, transforming the data platform into a true decision platform.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Partner with Enterprise AEs to identify, qualify, and close AI/BI opportunities</li>\n<li>Engage C-level, analytics, and line-of-business leaders to modernise analytics strategies</li>\n<li>Displace or expand legacy BI platforms with AI-powered, governed analytics solutions</li>\n<li>Lead conversations around semantic governance, self-service analytics, and natural language data access</li>\n<li>Drive proof-of-value engagements and scale enterprise-wide adoption</li>\n<li>Align AI/BI initiatives to measurable business outcomes (productivity, speed to insight, revenue impact)</li>\n<li>Enable field teams and serve as a subject matter expert on modern analytics architectures</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>Enterprise sales experience in BI, analytics, data platforms, or AI/ML</li>\n<li>Strong understanding of modern analytics architectures and data governance</li>\n<li>Ability to sell to both technical and business stakeholders</li>\n<li>Executive presence and experience navigating complex buying cycles</li>\n<li>Passion for AI and the impact of GenAI on enterprise analytics</li>\n<li>Experience operating in a specialist or overlay sales model</li>\n<li>Ability to translate technical capabilities into clear business 
value</li>\n<li>7+ years of Enterprise Sales experience, exceeding quotas in larger accounts</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot</li>\n<li>Familiarity with semantic layers, metrics stores, or governed data models</li>\n<li>Understanding of lakehouse architectures and cloud data platforms</li>\n<li>Exposure to GenAI, natural language interfaces, or conversational applications</li>\n<li>Consulting or solution design experience in customer-facing roles</li>\n</ul>","url":"https://yubhub.co/jobs/job_47807ca3-e36","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8441884002","x-work-arrangement":"onsite","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Enterprise sales experience in BI, analytics, data platforms, or AI/ML","Strong understanding of modern analytics architectures and data governance","Ability to sell to both technical and business stakeholders","Executive presence and experience navigating complex buying cycles","Passion for AI and the impact of GenAI on enterprise analytics"],"x-skills-preferred":["Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot","Familiarity with semantic layers, metrics stores, or governed data models","Understanding of lakehouse architectures and cloud data platforms","Exposure to GenAI, natural language interfaces, or conversational applications","Consulting or solution design experience in customer-facing 
roles"],"datePosted":"2026-04-18T15:52:23.856Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Singapore"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Enterprise sales experience in BI, analytics, data platforms, or AI/ML, Strong understanding of modern analytics architectures and data governance, Ability to sell to both technical and business stakeholders, Executive presence and experience navigating complex buying cycles, Passion for AI and the impact of GenAI on enterprise analytics, Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot, Familiarity with semantic layers, metrics stores, or governed data models, Understanding of lakehouse architectures and cloud data platforms, Exposure to GenAI, natural language interfaces, or conversational applications, Consulting or solution design experience in customer-facing roles"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_015afe59-9fd"},"title":"Data Analyst II","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. 
By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>\n<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>\n<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Data at Brex</p>\n<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>\n<p>Our Data Scientists, Analysts, and Engineers work together to make data, and insights derived from data, a core asset across the company.</p>\n<p>What you’ll do</p>\n<p>As a Data Analyst II (DA), you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>\n<p>You will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses.</p>\n<p>This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our New York office.</p>\n<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home.</p>\n<p>We currently require a minimum of three coordinated days in the office per week, Monday, 
Wednesday and Thursday.</p>\n<p>As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities</p>\n<p>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</p>\n<p>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</p>\n<p>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</p>\n<p>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</p>\n<p>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</p>\n<p>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</p>\n<p>Contribute to the automation of recurring analyses and reporting workflows using Python.</p>\n<p>Requirements</p>\n<p>3+ years of experience in data analytics or a related role in a professional setting.</p>\n<p>2+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</p>\n<p>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</p>\n<p>Experience with Python for data analysis, automation, or scripting.</p>\n<p>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</p>\n<p>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</p>\n<p>Strong communication skills and the ability to work 
effectively with stakeholders across different functions and levels of technical fluency.</p>\n<p>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automate reporting, and build self-service data tools.</p>\n<p>Bonus points</p>\n<p>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</p>\n<p>Familiarity with dbt for data modeling and transformation.</p>\n<p>Exposure to data pipeline orchestration tools (e.g., Airflow).</p>\n<p>Experience in fintech, financial services, or payments.</p>\n<p>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</p>\n<p>Compensation</p>\n<p>The expected salary range for this role is $93,600 - $117,000.</p>\n<p>However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity.</p>\n<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_015afe59-9fd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8463702002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$93,600 - $117,000","x-skills-required":["SQL","Python","Business Intelligence","Data Visualization","Generative AI","LLM-based tools"],"x-skills-preferred":["Cloud data platforms","dbt","Data pipeline orchestration tools","Fintech","Financial services","Payments"],"datePosted":"2026-04-18T15:50:50.572Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, New 
York, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, Financial services, Payments","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":93600,"maxValue":117000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3d22e39a-bde"},"title":"Data Analyst II","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>\n<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>\n<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Data at Brex</p>\n<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>\n<p>Our Data Scientists, Analysts, and Engineers work together to make data, and insights derived from data, a core asset across the company.</p>\n<p>What you’ll do</p>\n<p>As a Data Analyst II (DA), you will play a central role 
in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>\n<p>You will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses.</p>\n<p>This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our San Francisco office.</p>\n<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home.</p>\n<p>We currently require a minimum of three coordinated days in the office per week, Monday, Wednesday and Thursday.</p>\n<p>As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities</p>\n<p>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</p>\n<p>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</p>\n<p>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</p>\n<p>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</p>\n<p>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</p>\n<p>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</p>\n<p>Contribute to the automation of 
recurring analyses and reporting workflows using Python.</p>\n<p>Requirements</p>\n<p>3+ years of experience in data analytics or a related role in a professional setting.</p>\n<p>2+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</p>\n<p>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</p>\n<p>Experience with Python for data analysis, automation, or scripting.</p>\n<p>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</p>\n<p>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</p>\n<p>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</p>\n<p>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automate reporting, and build self-service data tools.</p>\n<p>Bonus points</p>\n<p>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</p>\n<p>Familiarity with dbt for data modeling and transformation.</p>\n<p>Exposure to data pipeline orchestration tools (e.g., Airflow).</p>\n<p>Experience in fintech, financial services, or payments.</p>\n<p>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</p>\n<p>Compensation</p>\n<p>The expected salary range for this role is $93,600 - $117,000.</p>\n<p>However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity.</p>\n<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_3d22e39a-bde","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8463696002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$93,600 - $117,000","x-skills-required":["SQL","Python","Business Intelligence","Data Visualization","Generative AI","LLM-based tools"],"x-skills-preferred":["Cloud data platforms","dbt","Data pipeline orchestration tools","Fintech","Financial services","Payments"],"datePosted":"2026-04-18T15:44:50.317Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, Financial services, Payments","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":93600,"maxValue":117000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7f904cf7-7bd"},"title":"Data Analyst II","description":"<p>Join us at Brex, the intelligent finance platform that empowers companies to spend smarter and move faster in over 200 markets. As a Data Analyst II, you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>\n<p>As a member of our Data organization, you will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses. 
This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</li>\n<li>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</li>\n<li>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</li>\n<li>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</li>\n<li>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</li>\n<li>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</li>\n<li>Contribute to the automation of recurring analyses and reporting workflows using Python.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>4+ years of experience in data analytics or a related role in a professional setting.</li>\n<li>3+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</li>\n<li>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</li>\n<li>Proficiency in Python for data analysis, automation, and scripting (Pandas, NumPy, and similar libraries).</li>\n<li>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</li>\n<li>Strong quantitative and analytical skills with a demonstrated ability 
to translate data into business insights.</li>\n<li>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</li>\n<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automate reporting, and build self-service data tools.</li>\n</ul>\n<p>Bonus points:</p>\n<ul>\n<li>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</li>\n<li>Familiarity with dbt for data modeling and transformation.</li>\n<li>Exposure to data pipeline orchestration tools (e.g., Airflow).</li>\n<li>Experience in fintech, financial services, or payments.</li>\n<li>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</li>\n</ul>","url":"https://yubhub.co/jobs/job_7f904cf7-7bd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex LLC","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8463703002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Python","Business Intelligence","Data Visualization","Generative AI","LLM-based tools"],"x-skills-preferred":["Cloud data platforms","dbt","Data pipeline orchestration tools","Fintech, financial services, or payments"],"datePosted":"2026-04-18T15:39:28.984Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"São Paulo, São Paulo, Brazil"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Finance","skills":"SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, 
financial services, or payments"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_30da0df8-cc9"},"title":"F&S COE Analyst","description":"<p>We are seeking an analytically-driven Data Analyst to join our Finance &amp; Strategy team at Stripe. This role bridges the gap between data science and financial planning, requiring someone who can transform complex business data into actionable financial insights.</p>\n<p>You will build sophisticated dashboards, develop predictive models, and serve as the technical backbone for our FP&amp;A and GTM analytics initiatives. This is a unique opportunity for a data professional with financial acumen to directly influence strategic business decisions in a high-growth fintech environment.</p>\n<p><strong>Financial Data Analytics &amp; Modeling</strong></p>\n<ul>\n<li>Design, build, and maintain financial dashboards for FP&amp;A, Revenue Operations, and GTM teams using Tableau, Power BI, or Looker</li>\n<li>Develop automated financial reporting solutions that reduce manual effort and improve data accuracy</li>\n<li>Create sophisticated data models to support budgeting, forecasting, variance analysis, and scenario planning</li>\n<li>Build predictive models for revenue forecasting, customer lifetime value, churn analysis, and unit economics</li>\n</ul>\n<p><strong>Business Intelligence &amp; Reporting</strong></p>\n<ul>\n<li>Partner with Finance Business Partners and FP&amp;A teams to translate business requirements into technical solutions</li>\n<li>Design and implement data infrastructure for financial planning cycles (monthly/quarterly reviews, annual budgets, long-range planning)</li>\n<li>Develop self-service analytics capabilities enabling finance teams to access real-time business insights</li>\n<li>Create executive dashboards tracking key financial and operational metrics (ARR, bookings, retention, CAC, LTV)</li>\n</ul>\n<p><strong>Data Engineering &amp; 
Analytics Infrastructure</strong></p>\n<ul>\n<li>Write complex SQL queries to extract, transform, and analyze large datasets from multiple source systems</li>\n<li>Build ETL pipelines to integrate financial data from ERP, CRM, billing, and data warehouse systems</li>\n<li>Ensure data quality, consistency, and governance across financial reporting systems</li>\n<li>Optimize database performance and data architecture for scalability</li>\n</ul>\n<p><strong>Strategic Analysis &amp; Insights</strong></p>\n<ul>\n<li>Conduct deep-dive analyses on business performance, identifying trends, anomalies, and opportunities</li>\n<li>Support strategic initiatives through ad-hoc financial modeling and what-if scenario analysis</li>\n<li>Translate complex data findings into clear, actionable recommendations for leadership</li>\n<li>Collaborate with Data Science teams to develop advanced analytics and ML models for finance use cases</li>\n</ul>\n<p><strong>Required Qualifications</strong></p>\n<ul>\n<li>Advanced SQL proficiency (complex joins, window functions, CTEs, query optimization)</li>\n<li>Expert-level experience with at least one BI tool (Tableau, Power BI, Looker, or Qlik)</li>\n<li>Advanced Excel/Google Sheets skills (pivot tables, complex formulas, data modeling)</li>\n<li>Python or R for data analysis, automation, and statistical modeling</li>\n<li>Cloud data platforms (Snowflake, BigQuery, Redshift, Databricks)</li>\n<li>ETL tools (dbt, Airflow, Fivetran) and version control (Git)</li>\n</ul>\n<p><strong>Financial &amp; Business Acumen</strong></p>\n<ul>\n<li>Experience in data analytics within finance, FP&amp;A, or revenue operations functions</li>\n<li>Strong understanding of financial statements (P&amp;L, balance sheet, cash flow)</li>\n<li>Knowledge of key financial metrics: ARR, MRR, bookings, revenue recognition, CAC, LTV, gross margin, EBITDA</li>\n<li>Experience with financial planning processes: budgeting, forecasting, variance analysis, scenario 
modeling</li>\n<li>Understanding of SaaS/subscription business models and revenue recognition principles (ASC 606 preferred)</li>\n</ul>\n<p><strong>Analytical &amp; Problem-Solving</strong></p>\n<ul>\n<li>Proven ability to work with large, complex datasets and derive meaningful insights</li>\n<li>Experience building financial models and dashboards that drive executive decision-making</li>\n<li>Strong statistical analysis skills and understanding of data visualization best practices</li>\n<li>Track record of translating ambiguous business problems into structured analytical frameworks</li>\n</ul>\n<p><strong>Preferred Experience</strong></p>\n<ul>\n<li>Background in fintech, payments, B2B SaaS, or high-growth technology companies</li>\n<li>Experience supporting GTM analytics (sales forecasting, pipeline analysis, quota setting)</li>\n<li>Familiarity with finance systems: NetSuite, Anaplan, Adaptive Planning, Salesforce, Stripe Billing</li>\n<li>Exposure to data science methodologies and machine learning concepts</li>\n<li>Previous work in cross-functional environments collaborating with finance, data science, and business teams</li>\n</ul>\n<p><strong>Key Competencies</strong></p>\n<ul>\n<li>Business Acumen: Ability to understand complex business models and translate them into data requirements</li>\n<li>Technical Excellence: Deep technical skills with commitment to code quality and best practices</li>\n<li>Communication: Exceptional ability to explain technical concepts to non-technical stakeholders</li>\n<li>Stakeholder Management: Experience partnering with senior leaders and influencing through data</li>\n<li>Ownership Mindset: Self-directed with ability to manage multiple priorities and drive projects to completion</li>\n<li>Continuous Learning: Curiosity to learn new tools, techniques, and business domains</li>\n<li>Attention to Detail: Commitment to data accuracy and quality in high-stakes financial 
reporting</li>\n</ul>\n<p><strong>Education</strong></p>\n<ul>\n<li>Bachelor&#39;s degree in Finance, Economics, Statistics, Mathematics, Computer Science, Engineering, or related quantitative field</li>\n<li>Advanced degree (MBA, MS in Analytics/Data Science) or relevant certifications (CFA, CPA, data analytics certifications) a plus</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_30da0df8-cc9","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Stripe","sameAs":"https://stripe.com/","logo":"https://logos.yubhub.co/stripe.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/stripe/jobs/7597624","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Tableau","Power BI","Looker","Python","R","Cloud data platforms","ETL tools","Version control"],"x-skills-preferred":["Machine learning","Data science","Finance systems","Data visualization"],"datePosted":"2026-03-31T18:15:28.979Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bengaluru"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"SQL, Tableau, Power BI, Looker, Python, R, Cloud data platforms, ETL tools, Version control, Machine learning, Data science, Finance systems, Data visualization"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7af16166-8fd"},"title":"FBS Senior Data Domain Architect","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. 
That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p><strong>What to expect on your journey with us:</strong></p>\n<ul>\n<li>A solid and innovative company with a strong market presence</li>\n<li>A dynamic, diverse, and multicultural work environment</li>\n<li>Leaders with deep market knowledge and strategic vision</li>\n<li>Continuous learning and development</li>\n</ul>\n<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>\n<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>\n<li>Works independently, receiving minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>\n<li>Reviews high level design to ensure alignment to Solution Architecture.</li>\n<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>\n<li>Mentors developers and creates reference implementations/frameworks.</li>\n<li>Partners with System Architects to elaborate capabilities and features.</li>\n<li>Delivers single domain architecture solutions and executes a continuous domain level architecture improvement roadmap. 
Actively supports design and steering of a continuous delivery pipeline.</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>Over 6 years of experience as a senior domain architect for Data domains</li>\n<li>Advanced English Level</li>\n<li>Master&#39;s degree (PLUS)</li>\n<li>Insurance Experience (PLUS); Financial Services (PLUS)</li>\n</ul>\n<p><strong>Technical &amp; Business Skills:</strong></p>\n<ul>\n<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>\n<li>Data Architecture / Data Modeling – Advanced (MUST)</li>\n<li>Data Warehouse – Advanced (MUST)</li>\n<li>Cloud Data Platforms - Advanced</li>\n<li>Data Integration Tools – Advanced</li>\n<li>Snowflake or Databricks - Intermediate (4-6 Years) MUST</li>\n<li>Any Cloud - Intermediate (4-6 Years)</li>\n<li>Power BI or Tableau - Intermediate (4-6 Years)</li>\n<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>\n<li>Data Lakehouse – Intermediate (MUST)</li>\n<li>Data Governance - Intermediate</li>\n<li>AI/ML - Entry Level (PLUS)</li>\n<li>Master Data Management - Intermediate</li>\n<li>Operational Data Management - Intermediate</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7af16166-8fd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/jdUFHSPZZjHsgd3TR4R3BS/remote-fbs-senior-data-domain-architect-in-colombia-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["ETL/ELT Tools (Informatica, DBT)","Data Architecture / Data Modeling","Data Warehouse","Cloud Data Platforms","Data Integration Tools","Snowflake or Databricks","Any Cloud","Power BI or Tableau","Data Science tools (Sagemaker, Databricks)","Data Lakehouse"],"x-skills-preferred":["Data Governance","AI/ML","Master Data Management","Operational Data Management"],"datePosted":"2026-03-09T17:00:36.230Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, AI/ML, Master Data Management, Operational Data Management"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7b03b30a-b20"},"title":"FBS Senior Data Domain Architect","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. 
By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>\n<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>\n<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>\n<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>\n<li>Works independently, receiving minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>\n<li>Reviews high level design to ensure alignment to Solution Architecture.</li>\n<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>\n<li>Mentors developers and creates reference implementations/frameworks.</li>\n<li>Partners with System Architects to elaborate capabilities and features.</li>\n<li>Delivers single domain architecture solutions and executes a 
continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7b03b30a-b20","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/1U952YA2QBa8zK7Tm5d3Lm/remote-fbs-senior-data-domain-architect-in-mexico-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["ETL/ELT Tools (Informatica, DBT)","Data Architecture / Data Modeling","Data Warehouse","Cloud Data Platforms","Data Integration Tools","Snowflake or Databricks","Any Cloud","Power BI or Tableau","Data Science tools (Sagemaker, Databricks)","Data Lakehouse","Data Governance","Master Data Management","Operational Data Management"],"x-skills-preferred":["AI/ML"],"datePosted":"2026-03-09T16:59:14.361Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, Master Data Management, Operational Data Management, AI/ML"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_dcfed817-412"},"title":"FBS Senior Data Domain Architect","description":"<p>We&#39;re looking for a Senior Data Domain Architect to join our team. 
As a Senior Data Domain Architect, you will design and develop Data/Domain IT architecture solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>\n<p><strong>What to expect on your journey with us:</strong></p>\n<ul>\n<li>A solid and innovative company with a strong market presence</li>\n<li>A dynamic, diverse, and multicultural work environment</li>\n<li>Leaders with deep market knowledge and strategic vision</li>\n<li>Continuous learning and development</li>\n</ul>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Utilize in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery</li>\n<li>Solve complex problems and partner effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization</li>\n<li>Work independently, receiving minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge</li>\n<li>Review high level design to ensure alignment to Solution Architecture</li>\n<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives</li>\n<li>Mentor developers and create reference implementations/frameworks</li>\n<li>Partner with System Architects to elaborate capabilities and features</li>\n<li>Deliver single domain architecture solutions and execute a continuous domain level architecture improvement roadmap. 
Actively support design and steering of a continuous delivery pipeline</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>Over 6 years of experience as a senior domain architect for Data domains</li>\n<li>Advanced English Level</li>\n<li>Master&#39;s degree (PLUS)</li>\n<li>Insurance Experience (PLUS); Financial Services (PLUS)</li>\n</ul>\n<p><strong>Technical &amp; Business Skills:</strong></p>\n<ul>\n<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>\n<li>Data Architecture / Data Modeling – Advanced (MUST)</li>\n<li>Data Warehouse – Advanced (MUST)</li>\n<li>Cloud Data Platforms - Advanced</li>\n<li>Data Integration Tools – Advanced</li>\n<li>Snowflake or Databricks - Intermediate (4-6 Years) MUST</li>\n<li>Any Cloud - Intermediate (4-6 Years)</li>\n<li>Power BI or Tableau - Intermediate (4-6 Years)</li>\n<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>\n<li>Data Lakehouse – Intermediate (MUST)</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<ul>\n<li>A competitive salary and performance-based bonuses</li>\n<li>Comprehensive benefits package</li>\n<li>Flexible work arrangements (remote and/or office-based)</li>\n<li>Private Health Insurance</li>\n<li>Paid Time Off</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_dcfed817-412","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/x7tKXYFBB815ca6oBV5T2E/remote-fbs-senior-data-domain-architect-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["ETL/ELT Tools (Informatica, DBT)","Data 
Architecture / Data Modeling","Data Warehouse","Cloud Data Platforms","Data Integration Tools","Snowflake or Databricks","Any Cloud","Power BI or Tableau","Data Science tools (Sagemaker, Databricks)","Data Lakehouse"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:53:31.425Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_56dc9a51-e66"},"title":"Principal Consultant - Data Architecture","description":"<p><strong>Principal Consultant - Data Architecture</strong></p>\n<p>You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset.</p>\n<p><strong>About Your Role</strong></p>\n<p>As a Principal Data Architecture Consultant, you will act as a senior technical leader in complex data and analytics engagements. 
You will shape and govern end-to-end enterprise data architectures, lead technical teams, and serve as a trusted technical advisor for clients and internal stakeholders.</p>\n<p><strong>Your Role Will Include:</strong></p>\n<ul>\n<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>\n<li>Translate business objectives into scalable, secure, and compliant data solutions</li>\n<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>\n<li>Guide delivery teams through implementation, rollout, and production readiness</li>\n<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>\n<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>\n<li>Support pre-sales and solution design activities from a technical perspective</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>\n<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>\n<li>Strong client-facing experience in complex enterprise environments</li>\n</ul>\n<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>\n<ul>\n<li>Strong expertise in modern data architectures, including:</li>\n<li>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures</li>\n<li>Modern Data Architecture design principles</li>\n<li>Batch and streaming data integration patterns</li>\n<li>Data Platform, DevOps, deployment and security architectures</li>\n<li>Analytics and AI enablement architectures</li>\n<li>Hands-on experience with cloud data platforms, e.g.:</li>\n<li>Azure, AWS or GCP</li>\n<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>\n<li>Strong SQL skills and 
experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>\n<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>\n<li>Solid understanding of API-based and event-driven architectures</li>\n<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation, etc.</li>\n</ul>\n<p><strong>Engineering &amp; Platform Foundations</strong></p>\n<ul>\n<li>Experience with data pipelines, orchestration, and automation</li>\n<li>Familiarity with CI/CD concepts and production-grade deployments</li>\n<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>\n</ul>\n<p><strong>Data Management &amp; Governance</strong></p>\n<ul>\n<li>Strong understanding of data management and governance principles, including:</li>\n<li>Data quality, metadata, lineage, master data management</li>\n<li>Data Management software and tools</li>\n<li>Security, access control, and compliance considerations</li>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>\n<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>\n<li>Hands-on experience with data governance or metadata tools</li>\n<li>Cloud, data, or architecture certifications</li>\n</ul>\n<p><strong>Language &amp; Mobility</strong></p>\n<ul>\n<li>Very good English skills</li>\n<li>Willingness to travel for project-related work</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>You will be utilizing the most innovative technological solutions in the modern data ecosystem. 
In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>\n<p><strong>About Infosys Consulting</strong></p>\n<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market-leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>\n<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>\n<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>\n<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. 
Apply today!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_56dc9a51-e66","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Infosys Consulting - Europe","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/hpBWjvvy8D6B1f818cHxZR/remote-principal-consultant---data-architecture-in-poland-at-infosys-consulting---europe","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["enterprise data architecture","system data integration","data engineering","analytics","modern data architectures","Data Mesh/ Data Fabric/ Data lake / data warehouse architectures","Modern Data Architecture design principles","Batch and streaming data integration patterns","Data Platform, DevOps, deployment and security architectures","Analytics and AI enablement architectures","cloud data platforms","Azure","AWS","GCP","Databricks","Snowflake","BigQuery","Azure Synapse / Microsoft Fabric","SQL","relational databases","Postgres","SQL Server","Oracle","NoSQL databases","Cosmos DB","MongoDB","InfluxDB","API-based and event-driven architectures","data migration programmes","data pipelines","orchestration","automation","CI/CD concepts","production-grade deployments","distributed systems","Docker","Kubernetes","data management and governance principles","data quality","metadata","lineage","master data management","data management software and tools","security","access control","compliance considerations","Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience"],"x-skills-preferred":["advanced analytics","AI / ML or GenAI","streaming platforms","Kafka","Azure Event Hubs","data governance or metadata tools","cloud","data","architecture 
certifications"],"datePosted":"2026-03-09T16:51:22.857Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Poland"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"enterprise data architecture, system data integration, data engineering, analytics, modern data architectures, Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, cloud data platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, SQL, relational databases, Postgres, SQL Server, Oracle, NoSQL databases, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, data migration programmes, data pipelines, orchestration, automation, CI/CD concepts, production-grade deployments, distributed systems, Docker, Kubernetes, data management and governance principles, data quality, metadata, lineage, master data management, data management software and tools, security, access control, compliance considerations, Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience, advanced analytics, AI / ML or GenAI, streaming platforms, Kafka, Azure Event Hubs, data governance or metadata tools, cloud, data, architecture certifications"}]}