<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>8ba551e0-be3</externalid>
      <Title>Data Analyst - Physical Infrastructure</Title>
      <Description><![CDATA[<p>We are seeking a Data Analyst to join xAI&#39;s Infrastructure team responsible for building and operating world-class datacenters and power generation facilities. In this role, you will analyse power and cooling performance data, develop forecasts for utility consumption and costs, build and maintain business analytics dashboards, and deliver data-driven insights to optimise our rapidly expanding physical infrastructure for AI supercomputing.</p>
<p>Responsibilities:</p>
<ul>
<li>Collect, clean, integrate, and analyse high-volume power, cooling, and energy usage data from datacentre facilities and power plants</li>
<li>Build and refine forecasting models for electricity, water, and other utility consumption to support budgeting, planning, and procurement</li>
<li>Design, develop, and maintain interactive business intelligence dashboards and reports using tools such as Seeq, Tableau, Power BI, Looker, or similar</li>
<li>Identify trends, anomalies, inefficiencies, and optimisation opportunities in power distribution and cooling systems</li>
<li>Partner with mechanical, electrical, and facilities engineering teams to translate analytical findings into engineering and operational improvements</li>
<li>Support infrastructure expansion planning through scenario analysis, capacity modelling, and cost projections</li>
<li>Automate data collection pipelines and reporting processes to enable real-time visibility and decision making</li>
<li>Present clear, actionable insights and recommendations to cross-functional teams and leadership</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>Competitive salary and benefits package</Salaryrange>
      <Skills>SQL, Python, Tableau, Power BI, Looker, Statistics, Time-series analysis, Forecasting techniques, Energy, Utilities, Datacentres, Critical infrastructure, Industrial facilities, SCADA systems, Building management systems, IoT sensor data, Cloud data platforms, Power systems, HVAC/cooling efficiency metrics, Energy modelling</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>xAI</Employername>
      <Employerlogo>https://logos.yubhub.co/xai.com.png</Employerlogo>
      <Employerdescription>xAI creates AI systems to understand the universe and aid humanity in its pursuit of knowledge. The organisation operates with a flat structure and expects employees to be hands-on.</Employerdescription>
      <Employerwebsite>https://www.xai.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/xai/jobs/5112514007</Applyto>
      <Location>Memphis, TN</Location>
      <Country></Country>
      <Postedate>2026-04-24</Postedate>
    </job>
    <job>
      <externalid>b699c631-37e</externalid>
      <Title>FBS Data Engineer-ETL (Informatica)</Title>
      <Description><![CDATA[<p>We are looking for a skilled Data Engineer to design, build, and maintain data pipelines that support analytics and business intelligence initiatives. This role involves both enhancing existing pipelines and developing new ones to integrate data from diverse internal and external sources.</p>
<p>The ideal candidate will have advanced SQL and Informatica skills, experience in ETL development, and a foundational understanding of dimensional data modeling. Experience with dbt is a plus.</p>
<p>Key responsibilities include designing, developing, and maintaining data pipelines and ETL workflows, enhancing and optimising existing data pipelines, building new data ingestion pipelines, and using Informatica to develop and manage ETL processes.</p>
<p>The successful candidate will have a bachelor&#39;s degree in Computer Science, Information Systems, or a related field, and 2-4 years of hands-on experience in data engineering or ETL development using Informatica.</p>
<p>They will also have advanced-level proficiency in writing, optimising, and troubleshooting SQL queries, intermediate experience building and managing pipelines using ETL platforms, and at least 3 years using Informatica for data integration tasks.</p>
<p>Excellent problem-solving and communication skills, with the ability to collaborate across teams, are essential for this role.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Informatica, ETL development, dimensional data modeling, dbt, cloud data platforms, AWS, GCP, Azure</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/capgemini.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global consulting and technology services company. This role supports one of the United States&apos; largest insurers, which provides a wide range of insurance and financial services products with gross written premiums well over US$25 billion.</Employerdescription>
      <Employerwebsite>https://www.capgemini.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/pD4BtxSbTed3C7zp5tL7cF/remote-fbs-data-engineer-etl-(informatica)-in-brazil-at-capgemini</Applyto>
      <Location>Brazil</Location>
      <Country></Country>
      <Postedate>2026-04-24</Postedate>
    </job>
    <job>
      <externalid>6c819191-3df</externalid>
      <Title>Data Architect</Title>
      <Description><![CDATA[<p>Do you want to boost your career and collaborate with expert, talented colleagues to solve and deliver against our clients&#39; most important challenges? We are growing and are looking for people to join our team. You&#39;ll be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset.</p>
<p>The ideal candidate will have extensive experience in designing and implementing data architectures, with a strong understanding of database management, data modelling, and data governance. This role requires a strategic thinker with strong analytical and problem-solving skills and the ability to work collaboratively with clients and cross-functional teams.</p>
<p>As a Data Architect, you will design and implement robust, scalable, secure, and optimized data solutions that support business requirements and strategic goals. You will evaluate the client&#39;s existing data estate, diagnose underlying issues, and propose potential solutions. You will also collaborate with clients to understand their data needs and provide expert advice on data management and architecture.</p>
<p>Responsibilities:</p>
<ul>
<li>Design and implement data models, data flow diagrams, and data dictionaries</li>
<li>Oversee the ingestion and integration of data from multiple sources into enterprise data platforms</li>
<li>Conduct data quality assessments and implement data governance processes and best practices</li>
<li>Stay updated with the latest trends and technologies in data architecture and management</li>
<li>Provide technical guidance and mentorship to data engineers and other team members</li>
<li>Identify and mitigate data-related risks throughout the project lifecycle</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Proven experience as a Data Architect, with 10+ years of experience in data architecture, database management, and data modelling</li>
<li>Strong knowledge of software development methodologies, tools, and frameworks, particularly Agile</li>
<li>Proficiency in both SQL and NoSQL database management systems (e.g. SQL Server, Oracle, MongoDB, CosmosDB, Snowflake, Databricks)</li>
<li>Hands-on experience with data modelling tools, data warehousing, ETL processes, and data integration techniques</li>
<li>Experience with at least one cloud data platform (e.g. AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark)</li>
</ul>
<p>Given that this is just a short snapshot of the role, we encourage you to apply even if you don&#39;t meet all the requirements listed above. We are looking for individuals who strive to make an impact and are eager to learn.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data architecture, database management, data modelling, data governance, SQL, NoSQL, Agile, cloud data platform, big data technologies</Skills>
      <Category>IT</Category>
      <Industry>Consulting</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/infosys.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting is a globally renowned management consulting firm that works with market leading brands across sectors. Its parent organization, Infosys, is a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://www.infosys.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/gxyLcoL5pitQCmJJzTECEv/remote-data-architect-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-04-24</Postedate>
    </job>
    <job>
      <externalid>94fdb80f-cee</externalid>
      <Title>Cloud Data Engineer</Title>
      <Description><![CDATA[<p>Part of The Brandtech Group, fifty-five is a data consultancy helping brands collect, analyse and activate their data across paid, earned and owned channels to increase their marketing ROI and improve customer experience.</p>
<p>As part of the company&#39;s continued expansion into cloud services in APAC, we are hiring a Cloud Data Engineer to join our Taipei team.</p>
<p>This person will work closely with our clients in Taiwan and the region, collaborating with both our local and global engineering teams.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design and implement data architectures and pipelines for cloud and digital analytics projects on cloud platforms</li>
<li>Deliver hands-on technical services including cloud migration, data transformation, data warehousing, visualization, and advanced analytics</li>
<li>Set up CI/CD pipelines and deployment workflows to ensure proper integration of cloud infrastructure and data pipelines</li>
<li>Streamline and automate processes to optimize performance and cost-efficiency for digital analytics platforms</li>
<li>Support pre-sales activities with local consultants (e.g. demo development, RFP contribution, technical solutioning)</li>
<li>Collaborate with Global Engineering team to develop and deliver POCs for cloud and data-related use cases</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>University degree in Computer Science, Information Systems, or related disciplines</li>
<li>Minimum 1 year of experience with cloud data platforms (GCP preferred; AWS or Azure also welcome)</li>
<li>Familiar with data engineering concepts and tools (e.g. BigQuery, Dataflow, Pub/Sub, Airflow, etc.)</li>
<li>Proficient in one or more programming languages (e.g. Python, Java)</li>
<li>Knowledge of API design, microservices, and DevOps practices (CI/CD, version control, containerization)</li>
<li>Good understanding of data analytics, data warehousing, and visualization (e.g. Looker, Data Studio, Tableau)</li>
<li>Experience with website or mobile app tracking implementation is a plus</li>
<li>Professional cloud certification (GCP, AWS, or Azure) is a plus</li>
<li>Able to communicate technical concepts clearly to non-technical stakeholders</li>
<li>Strong problem-solving skills, self-driven, and collaborative</li>
<li>Fluent in English and Mandarin Chinese</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Exposure to cloud automation, marketing platforms, and media data analytics projects</li>
<li>Opportunity to work with our global consulting and engineering teams to engage our clients from diverse industries around the world</li>
<li>20 days Annual Leave</li>
<li>Flexibility to work remotely (up to 2 days a week work-from-home policy)</li>
<li>Regular team activities, including TGIFs, team lunches, and off-sites</li>
<li>A multicultural environment with employees from over 20 countries</li>
<li>Values centered on excellence, caring and sharing</li>
<li>Continuous (and certified) training on the digital ecosystem and technologies (initial training for all new employees, followed by ongoing training sessions, etc.)</li>
<li>Particular importance given to work-life balance, well-being, and the right to disconnect</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>cloud data platforms, BigQuery, Dataflow, Pub/Sub, Airflow, Python, Java, API design, microservices, DevOps practices, data analytics, data warehousing, visualization, website or mobile app tracking implementation, professional cloud certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>fifty-five</Employername>
      <Employerlogo>https://logos.yubhub.co/fifty-five.com.png</Employerlogo>
      <Employerdescription>fifty-five is a data consultancy helping brands collect, analyse and activate their data across paid, earned and owned channels to increase their marketing ROI and improve customer experience. It has over 300 employees globally.</Employerdescription>
      <Employerwebsite>https://www.fifty-five.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/txdbDrb5JndzD9ytk688xh/hybrid-cloud-data-engineer---taiwan-in-taipei-at-fifty-five</Applyto>
      <Location>Taipei</Location>
      <Country></Country>
      <Postedate>2026-04-24</Postedate>
    </job>
    <job>
      <externalid>5973e28c-ce1</externalid>
      <Title>Data Analyst - Physical Infrastructure</Title>
      <Description><![CDATA[<p>We are seeking a Data Analyst to join xAI&#39;s Infrastructure team responsible for building and operating world-class datacenters and power generation facilities. In this role, you will focus on analysing power and cooling performance data, developing forecasts for utility consumption and costs, and delivering data-driven insights to optimise our rapidly expanding physical infrastructure for AI supercomputing.</p>
<p>Responsibilities:</p>
<ul>
<li>Collect, clean, integrate, and analyse high-volume power, cooling, and energy usage data from datacenter facilities and power plants</li>
<li>Build and refine forecasting models for electricity, water, and other utility consumption to support budgeting, planning, and procurement (see the sketch after this list)</li>
<li>Design, develop, and maintain interactive business intelligence dashboards and reports using tools such as Seeq, Tableau, Power BI, Looker, or similar</li>
<li>Identify trends, anomalies, inefficiencies, and optimisation opportunities in power distribution and cooling systems</li>
<li>Partner with mechanical, electrical, and facilities engineering teams to translate analytical findings into engineering and operational improvements</li>
<li>Support infrastructure expansion planning through scenario analysis, capacity modelling, and cost projections</li>
<li>Automate data collection pipelines and reporting processes to enable real-time visibility and decision making</li>
<li>Present clear, actionable insights and recommendations to cross-functional teams and leadership</li>
</ul>
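<p>For illustration only, here is a minimal sketch of the forecasting work described above, assuming a pandas DataFrame of hourly meter readings with hypothetical columns <code>timestamp</code> and <code>kwh</code>; it is a seasonal-naive baseline, not xAI tooling:</p>
<pre><code># Seasonal-naive utility-consumption forecast sketch (hypothetical data; pandas only).
import pandas as pd

def forecast_hourly_kwh(readings: pd.DataFrame, horizon_hours: int = 24) -> pd.Series:
    """Forecast the next `horizon_hours` by averaging the same hour of day
    over the trailing four weeks of hourly meter readings."""
    s = readings.set_index("timestamp")["kwh"].sort_index()
    recent = s[s.index >= s.index[-1] - pd.Timedelta(days=28)]
    hour_of_day_mean = recent.groupby(recent.index.hour).mean()
    start = s.index[-1] + pd.Timedelta(hours=1)
    future_index = pd.date_range(start, periods=horizon_hours, freq="h")
    values = [hour_of_day_mean[ts.hour] for ts in future_index]
    return pd.Series(values, index=future_index, name="kwh_forecast")
</code></pre>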
<p>Basic Qualifications:</p>
<ul>
<li>4+ years of professional experience in data analysis, business intelligence, or analytics engineering</li>
<li>Strong SQL skills and proficiency in Python (pandas, scikit-learn, or similar) or R for data analysis and modelling</li>
<li>Hands-on experience building dashboards and visualisations with Tableau, Power BI, Looker, or equivalent</li>
<li>Solid foundation in statistics, time-series analysis, and forecasting techniques</li>
<li>Experience working with large datasets and building scalable reporting solutions</li>
<li>Excellent written and verbal communication skills</li>
</ul>
<p>Preferred Skills and Experience:</p>
<ul>
<li>Background in energy, utilities, datacenters, critical infrastructure, or industrial facilities</li>
<li>Familiarity with SCADA systems, building management systems (BMS), or IoT sensor data</li>
<li>Experience with cloud data platforms (Snowflake, BigQuery, AWS/GCP/Azure data services)</li>
<li>Knowledge of power systems, HVAC/cooling efficiency metrics (PUE, WUE, etc.), or energy modelling (a worked example follows this list)</li>
<li>Advanced degree in Data Science, Statistics, Engineering, Operations Research, or related quantitative field</li>
</ul>
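<p>For reference on the efficiency metrics named above: PUE (power usage effectiveness) is total facility energy divided by IT equipment energy, so values near 1.0 mean little overhead. A short worked example with made-up numbers:</p>
<pre><code># PUE = total facility energy / IT equipment energy (dimensionless, at least 1.0).
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy."""
    assert it_equipment_kwh > 0, "IT load must be positive"
    return total_facility_kwh / it_equipment_kwh

# Hypothetical month: 132 GWh total vs 110 GWh of IT load gives PUE 1.2,
# i.e. 0.2 kWh of cooling and distribution overhead per kWh of compute.
print(round(pue(132_000_000, 110_000_000), 2))  # 1.2
</code></pre>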
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Python, Tableau, Power BI, Looker, Statistics, Time-series analysis, Forecasting, Data analysis, Business intelligence, Analytics engineering, Energy, Utilities, Datacenters, Critical infrastructure, Industrial facilities, SCADA systems, Building management systems, IoT sensor data, Cloud data platforms, Power systems, HVAC/cooling efficiency metrics, Energy modelling</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>xAI</Employername>
      <Employerlogo>https://logos.yubhub.co/xai.com.png</Employerlogo>
      <Employerdescription>xAI creates AI systems to understand the universe and aid humanity in its pursuit of knowledge. The organisation operates with a flat structure and has a small, highly motivated team.</Employerdescription>
      <Employerwebsite>https://www.xai.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/xai/jobs/5112514007</Applyto>
      <Location>Memphis, TN</Location>
      <Country></Country>
      <Postedate>2026-04-24</Postedate>
    </job>
    <job>
      <externalid>b68ff4cc-e74</externalid>
      <Title>Data Engineer, Safeguards</Title>
      <Description><![CDATA[<p><strong>About the role</strong></p>
<p>Anthropic is looking for a Data Engineer to join the Safeguards team and build the data foundations that keep our AI systems safe. The Safeguards team works to monitor models, prevent misuse, and ensure user well-being.</p>
<p>You&#39;ll design and build the data pipelines, warehousing solutions, and analytical tooling that power our safety and trust efforts at scale. You&#39;ll work closely with engineers, data scientists, and policy teams to ensure the Safeguards organization has the data it needs to detect abuse patterns, measure the effectiveness of safety interventions, and make informed decisions about model behavior and enforcement.</p>
<p>This is a high-impact role where your work will directly support Anthropic&#39;s mission to develop AI that is safe and beneficial.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design, build, and maintain scalable data pipelines that support safety monitoring, abuse detection, and enforcement workflows</li>
<li>Develop and optimize data models and warehousing solutions to enable efficient analysis of large-scale usage and safety data</li>
<li>Build and maintain dashboards and reporting infrastructure that give Safeguards teams visibility into model behavior, misuse patterns, and enforcement outcomes</li>
<li>Collaborate with engineers to integrate data from multiple sources, including model outputs, user reports, and automated classifiers, into a unified analytical layer</li>
<li>Implement data quality frameworks, monitoring, and alerting to ensure the reliability of safety-critical data (see the sketch after this list)</li>
<li>Partner with research teams to surface data insights that inform model improvements and safety interventions</li>
<li>Develop self-service data tooling that enables stakeholders to explore safety data and generate reports independently</li>
<li>Contribute to data governance practices, including access controls, retention policies, and privacy-compliant data handling</li>
</ul>
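<p>To give the data-quality bullet above concrete shape, here is a minimal sketch of freshness, completeness, and uniqueness checks over a pandas DataFrame; the table layout, column names, and thresholds are invented for illustration and are not Anthropic infrastructure:</p>
<pre><code># Hypothetical data-quality checks for a safety-events table (pandas only).
from datetime import datetime, timedelta, timezone
import pandas as pd

def run_quality_checks(events: pd.DataFrame) -> list:
    """Return human-readable alerts; an empty list means all checks passed.
    Assumes tz-aware timestamps in an 'event_time' column."""
    alerts = []
    # Freshness: the newest event should be under an hour old.
    newest = events["event_time"].max()
    if datetime.now(timezone.utc) - newest > timedelta(hours=1):
        alerts.append(f"stale data: newest event at {newest}")
    # Completeness: classifier scores should rarely be missing.
    null_rate = events["classifier_score"].isna().mean()
    if null_rate > 0.01:
        alerts.append(f"classifier_score null rate {null_rate:.1%} exceeds 1%")
    # Uniqueness: event IDs must not be duplicated upstream.
    if events["event_id"].duplicated().any():
        alerts.append("duplicate event_id values detected")
    return alerts
</code></pre>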
<p><strong>You may be a good fit if you:</strong></p>
<ul>
<li>Have 3+ years of experience in data engineering, analytics engineering, or a related role</li>
<li>Are proficient in SQL and Python, with experience building and maintaining ETL/ELT pipelines</li>
<li>Have hands-on experience with modern data stack tools such as dbt, Airflow, Spark, or similar orchestration and transformation frameworks</li>
<li>Have worked with cloud data platforms (BigQuery, Redshift, Snowflake, or similar)</li>
<li>Are comfortable building dashboards and data visualizations using tools like Looker, Tableau, or Metabase</li>
<li>Communicate clearly and can translate complex data concepts for both technical and non-technical audiences</li>
<li>Are results-oriented, flexible, and willing to pick up slack even when it falls outside your job description</li>
<li>Care about the societal impacts of AI and are motivated by safety work</li>
</ul>
<p><strong>Strong candidates may have:</strong></p>
<ul>
<li>Experience with trust &amp; safety, integrity, fraud, or abuse detection data systems</li>
<li>Experience with large-scale event streaming systems (Kafka, Pub/Sub, Kinesis)</li>
<li>Built data infrastructure that supports ML model monitoring or evaluation</li>
<li>A background in statistical analysis, or experience collaborating closely with data scientists</li>
<li>Developed internal tooling or self-service analytics platforms</li>
</ul>
<p><strong>Strong candidates need not have:</strong></p>
<ul>
<li>A formal degree in Computer Science or a related field; we value practical experience and demonstrated ability over credentials</li>
<li>Prior experience in AI or machine learning; you&#39;ll learn the domain-specific context on the job</li>
<li>Previous experience at an AI safety or research organization</li>
<li>Deep expertise across every tool listed above; familiarity with a subset and a willingness to learn is enough</li>
</ul>
<p><strong>Logistics</strong></p>
<ul>
<li>Minimum education: Bachelor’s degree or an equivalent combination of education, training, and/or experience</li>
<li>Required field of study: A field relevant to the role as demonstrated through coursework, training, or professional experience</li>
<li>Minimum years of experience: Years of experience required will correlate with the internal job level requirements for the position</li>
<li>Location-based hybrid policy: Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</li>
<li>Visa sponsorship: We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</li>
</ul>
<p><strong>How we&#39;re different</strong></p>
<p>We believe that the highest-impact AI research will be big science. At Anthropic we work as a single cohesive team on just a few large-scale research efforts. And we value impact, advancing our long-term goals of steerable, trustworthy AI, rather than work on smaller and more specific puzzles. We view AI research as an empirical science, which has as much in common with physics and biology as with traditional efforts in computer science. We&#39;re an extremely collaborative group, and we host frequent research discussions to ensure that we are pursuing the highest-impact work at any given time. As such, we greatly value communication skills. The easiest way to understand our research directions is to read our recent research. This research continues many of the directions our team worked on prior to Anthropic, including: GPT-3, Circuit-Based Interpretability, Multimodal Neurons, Scaling Laws, AI &amp; Compute, Concrete Problems in AI Safety, and Learning from Human Preferences.</p>
<p><strong>Come work with us!</strong></p>
<p>Anthropic is a public benefit corporation headquartered in San Francisco. We offer competitive compensation and benefits, optional equity donation matching, generous vacation and parental leave, flexible working hours, and a lovely office space in which to collaborate with colleagues.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>£170,000 - £220,000</Salaryrange>
      <Skills>SQL, Python, ETL/ELT pipelines, dbt, Airflow, Spark, cloud data platforms, BigQuery, Redshift, Snowflake, Looker, Tableau, Metabase</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a public benefit corporation that creates reliable, interpretable, and steerable AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5156057008</Applyto>
      <Location>London, UK</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>47807ca3-e36</externalid>
      <Title>Strategic AI/BI Account Executive</Title>
      <Description><![CDATA[<p>We are seeking a Strategic AI/BI Account Executive to help enterprise customers transform how business users interact with data. This high-impact role sits within the AI Go-To-Market team and partners closely with Enterprise Account Executives to drive adoption of Databricks AI/BI and Genie in APJ.</p>
<p>Enterprise analytics is rapidly evolving from dashboards and static reporting to conversational, AI-driven decision platforms, at the convergence of business intelligence, data platforms, and generative AI. Databricks AI/BI and Genie empower business users to securely interact with governed data using natural language, transforming the data platform into a true decision platform; you will help organisations make that move beyond static dashboards to governed, conversational, AI-powered analytics.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Partner with Enterprise AEs to identify, qualify, and close AI/BI opportunities</li>
<li>Engage C-level, analytics, and line-of-business leaders to modernise analytics strategies</li>
<li>Displace or expand legacy BI platforms with AI-powered, governed analytics solutions</li>
<li>Lead conversations around semantic governance, self-service analytics, and natural language data access</li>
<li>Drive proof-of-value engagements and scale enterprise-wide adoption</li>
<li>Align AI/BI initiatives to measurable business outcomes (productivity, speed to insight, revenue impact)</li>
<li>Enable field teams and serve as a subject matter expert on modern analytics architectures</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>Enterprise sales experience in BI, analytics, data platforms, or AI/ML</li>
<li>Strong understanding of modern analytics architectures and data governance</li>
<li>Ability to sell to both technical and business stakeholders</li>
<li>Executive presence and experience navigating complex buying cycles</li>
<li>Passion for AI and the impact of GenAI on enterprise analytics</li>
<li>Experience operating in a specialist or overlay sales model</li>
<li>Ability to translate technical capabilities into clear business value</li>
<li>7+ years of Enterprise Sales experience, exceeding quotas in larger accounts</li>
</ul>
<p>Preferred qualifications include:</p>
<ul>
<li>Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot</li>
<li>Familiarity with semantic layers, metrics stores, or governed data models</li>
<li>Understanding of lakehouse architectures and cloud data platforms</li>
<li>Exposure to GenAI, natural language interfaces, or conversational applications</li>
<li>Consulting or solution design experience in customer-facing roles</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise sales (BI, analytics, data platforms, AI/ML), modern analytics architectures, data governance, selling to technical and business stakeholders, executive presence, complex buying cycles, GenAI, specialist/overlay sales models, Tableau, Power BI, Looker, ThoughtSpot, semantic layers, metrics stores, governed data models, lakehouse architectures, cloud data platforms, natural language interfaces, conversational applications, consulting, solution design</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company with over 10,000 organisations worldwide relying on its data intelligence platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8441884002</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>015afe59-9fd</externalid>
      <Title>Data Analyst II</Title>
      <Description><![CDATA[<p>Why join us</p>
<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>
<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>
<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>
<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>
<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>
<p>Data at Brex</p>
<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>
<p>Our Data Scientists, Analysts, and Engineers work together to make data, and insights derived from data, a core asset across the company.</p>
<p>What you’ll do</p>
<p>As a Data Analyst II (DA), you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>
<p>You will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses.</p>
<p>This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>
<p>Where you’ll work</p>
<p>This role will be based in our New York office.</p>
<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home.</p>
<p>We currently require a minimum of three coordinated days in the office per week: Monday, Wednesday, and Thursday.</p>
<p>As a perk, we also have up to four weeks per year of fully remote work!</p>
<p>Responsibilities</p>
<ul>
<li>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</li>
<li>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</li>
<li>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</li>
<li>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</li>
<li>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</li>
<li>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</li>
<li>Contribute to the automation of recurring analyses and reporting workflows using Python.</li>
</ul>
<p>Requirements</p>
<ul>
<li>3+ years of experience in data analytics or a related role in a professional setting.</li>
<li>2+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</li>
<li>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets; see the sketch after this list).</li>
<li>Experience with Python for data analysis, automation, or scripting.</li>
<li>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</li>
<li>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</li>
<li>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</li>
<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automate reporting, and build self-service data tools.</li>
</ul>
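<p>To make the SQL requirement above concrete, here is a small, hypothetical example of the kind of query described (a CTE feeding a window function), run through Python using the standard-library sqlite3 module; the table, columns, and database path are invented for illustration and are not Brex systems:</p>
<pre><code># Hypothetical example: month-over-month spend per department via a CTE + window function.
import sqlite3

QUERY = """
WITH monthly AS (
    SELECT department,
           strftime('%Y-%m', spend_date) AS month,
           SUM(amount) AS total_spend
    FROM card_transactions
    GROUP BY department, month
)
SELECT department,
       month,
       total_spend,
       total_spend - LAG(total_spend) OVER (
           PARTITION BY department ORDER BY month
       ) AS mom_change
FROM monthly
ORDER BY department, month;
"""

with sqlite3.connect("spend.db") as conn:  # placeholder database path
    for row in conn.execute(QUERY):
        print(row)
</code></pre>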
<p>Bonus points</p>
<ul>
<li>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</li>
<li>Familiarity with dbt for data modeling and transformation.</li>
<li>Exposure to data pipeline orchestration tools (e.g., Airflow).</li>
<li>Experience in fintech, financial services, or payments.</li>
<li>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</li>
</ul>
<p>Compensation</p>
<p>The expected salary range for this role is $93,600 - $117,000.</p>
<p>However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity.</p>
<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$93,600 - $117,000</Salaryrange>
      <Skills>SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, Financial services, Payments</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is an intelligent finance platform that enables companies to spend smarter and move faster in over 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8463702002</Applyto>
      <Location>New York, New York, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>50808499-c0b</externalid>
      <Title>Senior Customer Solutions Resident Architect</Title>
      <Description><![CDATA[<p>About Us</p>
<p>dbt Labs is the pioneer of analytics engineering, helping data teams transform raw data into reliable, actionable insights. Since 2016, we’ve grown from an open source project into the leading analytics engineering platform, now used by over 90,000 teams every week, driving data transformations and AI use cases.</p>
<p>As of February 2025, we’ve surpassed $100 million in annual recurring revenue (ARR) and serve more than 5,400 dbt Platform customers, including AstraZeneca, Sky, Nasdaq, Volvo, JetBlue, and SafetyCulture.</p>
<p>We’re backed by top-tier investors including Andreessen Horowitz, Sequoia Capital, and Altimeter. At our core, we believe in empowering data practitioners:</p>
<ul>
<li>Reliable, high-quality data is the fuel that propels AI-powered data engineering.</li>
<li>AI is changing data work, fast. dbt’s data control plane keeps data engineers ahead of that curve.</li>
<li>We empower engineers to deliver reliable, governed data faster, cheaper, and at scale.</li>
</ul>
<p>dbt Labs is now synonymous with analytics engineering, defining the modern data stack and serving as the data control plane for enterprise teams around the world. And we’re just getting started.</p>
<p>We’re growing fast and building a team of passionate, curious people across the globe. Learn more about what makes us special by checking out our values.</p>
<p><strong>About the Role</strong></p>
<p>We are seeking an experienced Senior Customer Solutions Resident Architect to join our team. In this role, you will drive critical customer outcomes by delivering high-impact technical guidance to strategic accounts. You will be part of a high-visibility initiative that supports pre-sales, accelerates adoption, enables key migrations, and mitigates churn risks.</p>
<p>This role is designed to deploy RA-level expertise flexibly, aligning with customer and business needs to drive growth, retention, and expansion.</p>
<p><strong>What You’ll Do</strong></p>
<ul>
<li>Accelerate Customer Success Across the Lifecycle
<ul>
<li>Support strategic pre-sales opportunities by providing technical expertise to prospects</li>
<li>Assist in launching and onboarding new customers who have not purchased RA services, ensuring they successfully adopt dbt Cloud</li>
<li>Execute proactive adoption plays, including migrations, new feature implementations (e.g., Semantic Layer, Mesh), and major version upgrades</li>
<li>Lead reactive adoption initiatives to de-risk churn or contraction and position accounts for future growth</li>
</ul>
</li>
<li>Deliver Technical Excellence
<ul>
<li>Advise on architecture, design, implementation, troubleshooting, and best practices in dbt Cloud environments</li>
<li>Build solution MVPs and guide long-term technical strategies tailored to customer needs</li>
<li>Engage on multiple projects simultaneously with clear scoping, start and end dates, and outcome tracking</li>
</ul>
</li>
<li>Collaborate Across Teams
<ul>
<li>Partner closely with Customer Solutions Architects (CSAs), Sales, Solutions Architects, Training, and Support</li>
<li>Provide feedback to Product and Engineering to improve customer experience and prioritize technical needs</li>
<li>Champion customer success through thoughtful, transparent communication and cross-functional collaboration</li>
</ul>
</li>
<li>Advance Best Practices and Team Impact
<ul>
<li>Help build out and refine this evolving function alongside the broader RA organization</li>
<li>Track and manage capacity and engagement effectiveness similarly to other RA-led initiatives</li>
</ul>
</li>
</ul>
<p><strong>What You’ll Need</strong></p>
<ul>
<li>5+ years of experience in technical customer-facing roles such as post-sales consulting, technical architecture, or solution delivery</li>
<li>Expertise with at least one modern cloud data platform (Snowflake, Databricks, BigQuery, or Redshift)</li>
<li>Hands-on experience deploying or configuring dbt Cloud, with at least 1 year working with dbt</li>
<li>Strong proficiency in SQL; working knowledge of Python in analytics contexts preferred</li>
<li>Comfort leading technical project delivery, managing scope, timelines, and stakeholder expectations across multiple simultaneous engagements</li>
<li>Clear, concise communication skills for both technical and executive audiences</li>
<li>A collaborative mindset, thriving in a remote, transparent, and highly cross-functional organization</li>
<li>Willingness to travel 2–4 times per year for company-wide events</li>
</ul>
<p><strong>What Will Make You Stand Out</strong></p>
<ul>
<li>dbt Analytics Engineering Certification</li>
<li>Ability to influence technical direction and build consensus across internal and customer teams</li>
<li>Experience with traditional enterprise ETL tools (e.g., Informatica, Datastage, Talend) and how they relate to modern data workflows</li>
<li>Familiarity with strategic sales or renewal processes, including proactive and reactive adoption efforts</li>
<li>Proven success accelerating usage, adoption, and expansion in large, complex accounts</li>
</ul>
<p><strong>Remote Hiring Process</strong></p>
<ul>
<li>Interview with a Talent Acquisition Partner</li>
<li>Interview with Hiring Manager</li>
<li>Task</li>
<li>Task Review</li>
<li>Final Values Interview</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Unlimited vacation time with a culture that actively encourages time off</li>
<li>401k plan with 3% guaranteed company contribution</li>
<li>Comprehensive healthcare coverage</li>
<li>Generous paid parental leave</li>
<li>Health &amp; wellness stipend</li>
<li>Flexible stipends for:
<ul>
<li>Home office setup</li>
<li>Learning and development</li>
<li>Office space</li>
<li>And more!</li>
</ul>
</li>
</ul>
<p><strong>Compensation</strong></p>
<p>We offer competitive compensation packages commensurate with experience, including salary, equity, and where applicable, performance-based pay. Our Talent Acquisition Team can answer questions around dbt Lab’s total rewards during your interview process.</p>
<p>In Boston, Chicago, Denver, Los Angeles, Philadelphia, New York Metro, San Francisco, DC Metro, Seattle, and Austin, an alternate range may apply, as specified below.</p>
<ul>
<li>The typical starting salary range for this role in the specific locations listed is: $163,000 - $200,000</li>
<li>The typical starting salary range for this role is: $146,000 - $180,000</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$146,000 - $180,000</Salaryrange>
      <Skills>modern cloud data platform, dbt Cloud, SQL, Python, technical project delivery, clear, concise communication skills, dbt Analytics Engineering Certification, traditional enterprise ETL tools, strategic sales or renewal processes, proven success accelerating usage, adoption, and expansion</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>dbt Labs</Employername>
      <Employerlogo>https://logos.yubhub.co/getdbt.com.png</Employerlogo>
      <Employerdescription>dbt Labs is a company that helps data teams transform raw data into reliable, actionable insights. It has grown from an open source project into the leading analytics engineering platform, now used by over 90,000 teams every week.</Employerdescription>
      <Employerwebsite>https://www.getdbt.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/dbtlabsinc/jobs/4682381005</Applyto>
      <Location>US - Remote</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3d22e39a-bde</externalid>
      <Title>Data Analyst II</Title>
      <Description><![CDATA[<p>Why join us</p>
<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>
<p>Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>
<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry.</p>
<p>We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream.</p>
<p>We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>
<p>Data at Brex</p>
<p>The Data organization develops insights, models, and data infrastructure for teams across Brex, including Sales, Marketing, Product, Engineering, and Operations.</p>
<p>Our Data Scientists, Analysts, and Engineers work together to make data, and insights derived from data, a core asset across the company.</p>
<p>What you’ll do</p>
<p>As a Data Analyst II (DA), you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>
<p>You will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses.</p>
<p>This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>
<p>Where you’ll work</p>
<p>This role will be based in our San Francisco office.</p>
<p>We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home.</p>
<p>We currently require a minimum of three coordinated days in the office per week: Monday, Wednesday, and Thursday.</p>
<p>As a perk, we also have up to four weeks per year of fully remote work!</p>
<p>Responsibilities</p>
<ul>
<li>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</li>
<li>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</li>
<li>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</li>
<li>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</li>
<li>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</li>
<li>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</li>
<li>Contribute to the automation of recurring analyses and reporting workflows using Python.</li>
</ul>
<p>Requirements</p>
<ul>
<li>3+ years of experience in data analytics or a related role in a professional setting.</li>
<li>2+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</li>
<li>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</li>
<li>Experience with Python for data analysis, automation, or scripting.</li>
<li>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</li>
<li>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</li>
<li>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</li>
<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automate reporting, and build self-service data tools.</li>
</ul>
<p>Bonus points</p>
<ul>
<li>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</li>
<li>Familiarity with dbt for data modeling and transformation.</li>
<li>Exposure to data pipeline orchestration tools (e.g., Airflow).</li>
<li>Experience in fintech, financial services, or payments.</li>
<li>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</li>
</ul>
<p>Compensation</p>
<p>The expected salary range for this role is $93,600 - $117,000.</p>
<p>However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity.</p>
<p>Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$93,600 - $117,000</Salaryrange>
      <Skills>SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech, Financial services, Payments</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is an intelligent finance platform that enables companies to spend smarter and move faster in over 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8463696002</Applyto>
      <Location>San Francisco, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7f904cf7-7bd</externalid>
      <Title>Data Analyst II</Title>
      <Description><![CDATA[<p>Join us at Brex, the intelligent finance platform that empowers companies to spend smarter and move faster in over 200 markets. As a Data Analyst II, you will play a central role in enhancing the operational tracking and reporting capabilities of different business teams across Brex.</p>
<p>As a member of our Data organization, you will work closely with Data Scientists, Data Engineers, and partner teams to drive meaningful insights for the business through visualizations, self-service tools, and ad-hoc analyses. This is a high-impact role in a fast-paced fintech environment where your work will directly influence strategic decisions.</p>
<p>Responsibilities:</p>
<ul>
<li>Apply data visualization and storytelling skills in creating business intelligence solutions (such as Looker and/or Hex dashboards) that enable actionable insights.</li>
<li>Perform ad-hoc analyses and deep dives to investigate business questions, surface trends, and provide data-driven recommendations.</li>
<li>Develop self-service data tools and processes that empower business stakeholders to independently monitor the performance and health of their respective areas.</li>
<li>Collaborate closely with Data Scientists and Data Engineers to identify data sources, enable data pipelines, and support the development of analytical data models that operationalize reports and dashboards.</li>
<li>Implement and maintain rigorous data quality checks to ensure the integrity and robustness of datasets used across dashboards, reports, and analyses.</li>
<li>Partner with various departments, including Sales, Operations, Product, and Finance, to understand their data needs and deliver tailored analyses and reporting that support strategic planning.</li>
<li>Contribute to the automation of recurring analyses and reporting workflows using Python.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>4+ years of experience in data analytics or a related role in a professional setting.</li>
<li>3+ years of experience working directly with Sales, Operations, Product, or equivalent business teams.</li>
<li>Fluency in SQL to manipulate data and perform complex analyses (CTEs, window functions, joins across large datasets).</li>
<li>Proficiency in Python for data analysis, automation, and scripting (Pandas, NumPy, and similar libraries).</li>
<li>Experience with business intelligence and data visualization tools (Looker, Hex, Tableau, or similar).</li>
<li>Strong quantitative and analytical skills with a demonstrated ability to translate data into business insights.</li>
<li>Strong communication skills and the ability to work effectively with stakeholders across different functions and levels of technical fluency.</li>
<li>Experience with generative AI and LLM-based tools (Claude Code, Cursor, GitHub Copilot) to perform and accelerate analyses, automate reporting, and build self-service data tools.</li>
</ul>
<p>Bonus points:</p>
<ul>
<li>Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Databricks).</li>
<li>Familiarity with dbt for data modeling and transformation.</li>
<li>Exposure to data pipeline orchestration tools (e.g., Airflow).</li>
<li>Experience in fintech, financial services, or payments.</li>
<li>Comfort operating in a fast-paced, high-growth environment with evolving priorities.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Python, Business Intelligence, Data Visualization, Generative AI, LLM-based tools, Cloud data platforms, dbt, Data pipeline orchestration tools, Fintech / financial services / payments</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Brex LLC</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is an intelligent finance platform that enables companies to spend smarter and move faster in over 200 markets.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8463703002</Applyto>
      <Location>São Paulo, São Paulo, Brazil</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>30da0df8-cc9</externalid>
      <Title>F&amp;S COE Analyst</Title>
      <Description><![CDATA[<p>We are seeking an analytically driven Data Analyst to join our Finance &amp; Strategy team at Stripe. This role bridges the gap between data science and financial planning, requiring someone who can transform complex business data into actionable financial insights.</p>
<p>You will build sophisticated dashboards, develop predictive models, and serve as the technical backbone for our FP&amp;A and GTM analytics initiatives. This is a unique opportunity for a data professional with financial acumen to directly influence strategic business decisions in a high-growth fintech environment.</p>
<p><strong>Financial Data Analytics &amp; Modeling</strong></p>
<ul>
<li>Design, build, and maintain financial dashboards for FP&amp;A, Revenue Operations, and GTM teams using Tableau, Power BI, or Looker</li>
<li>Develop automated financial reporting solutions that reduce manual effort and improve data accuracy</li>
<li>Create sophisticated data models to support budgeting, forecasting, variance analysis, and scenario planning</li>
<li>Build predictive models for revenue forecasting, customer lifetime value, churn analysis, and unit economics; a brief illustrative sketch follows this list</li>
</ul>
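<p>As a hedged illustration of the forecasting work named above, the sketch below fits a simple linear trend to an invented monthly revenue series. A production model would add seasonality, cohort effects, and uncertainty intervals; nothing here reflects Stripe's actual data or methods.</p>
<pre><code># Illustrative sketch only: the revenue series is invented.
import numpy as np

revenue = np.array([4.1, 4.4, 4.6, 5.0, 5.3, 5.7])  # monthly revenue, $M
months = np.arange(len(revenue))

# Fit a straight-line trend as a naive baseline forecast.
slope, intercept = np.polyfit(months, revenue, deg=1)

# Project the next three months from the fitted trend.
future = np.arange(len(revenue), len(revenue) + 3)
forecast = slope * future + intercept
print([round(v, 2) for v in forecast])
</code></pre>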
<p><strong>Business Intelligence &amp; Reporting</strong></p>
<ul>
<li>Partner with Finance Business Partners and FP&amp;A teams to translate business requirements into technical solutions</li>
<li>Design and implement data infrastructure for financial planning cycles (monthly/quarterly reviews, annual budgets, long-range planning)</li>
<li>Develop self-service analytics capabilities enabling finance teams to access real-time business insights</li>
<li>Create executive dashboards tracking key financial and operational metrics (ARR, bookings, retention, CAC, LTV)</li>
</ul>
<p><strong>Data Engineering &amp; Analytics Infrastructure</strong></p>
<ul>
<li>Write complex SQL queries to extract, transform, and analyze large datasets from multiple source systems</li>
<li>Build ETL pipelines to integrate financial data from ERP, CRM, billing, and data warehouse systems</li>
<li>Ensure data quality, consistency, and governance across financial reporting systems; a brief illustrative sketch follows this list</li>
<li>Optimize database performance and data architecture for scalability</li>
</ul>
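<p>The data-quality item above can be made concrete with a small check function. This is a hypothetical sketch: the extract layout and column names (invoice_id, amount) are assumptions for illustration, not a description of Stripe's pipelines.</p>
<pre><code># Illustrative sketch only: extract layout and column names are hypothetical.
import pandas as pd

def check_quality(df):
    """Return a list of data-quality issues found in a billing extract."""
    issues = []
    if df["invoice_id"].duplicated().any():
        issues.append("duplicate invoice_id values")
    if df["amount"].isna().any():
        issues.append("missing amounts")
    if df["amount"].lt(0).any():
        issues.append("negative amounts")
    return issues

extract = pd.DataFrame({
    "invoice_id": ["inv-1", "inv-2", "inv-2"],
    "amount":     [1200.0, None, 560.0],
})
print(check_quality(extract))  # flags the duplicate id and the missing amount
</code></pre>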
<p><strong>Strategic Analysis &amp; Insights</strong></p>
<ul>
<li>Conduct deep-dive analyses on business performance, identifying trends, anomalies, and opportunities</li>
<li>Support strategic initiatives through ad-hoc financial modeling and what-if scenario analysis</li>
<li>Translate complex data findings into clear, actionable recommendations for leadership</li>
<li>Collaborate with Data Science teams to develop advanced analytics and ML models for finance use cases</li>
</ul>
<p><strong>Required Qualifications</strong></p>
<ul>
<li>Advanced SQL proficiency (complex joins, window functions, CTEs, query optimization)</li>
<li>Expert-level experience with at least one BI tool (Tableau, Power BI, Looker, or Qlik)</li>
<li>Advanced Excel/Google Sheets skills (pivot tables, complex formulas, data modeling)</li>
<li>Python or R for data analysis, automation, and statistical modeling</li>
<li>Cloud data platforms (Snowflake, BigQuery, Redshift, Databricks)</li>
<li>ETL tools (dbt, Airflow, Fivetran) and version control (Git)</li>
</ul>
<p><strong>Financial &amp; Business Acumen</strong></p>
<ul>
<li>Experience in data analytics within finance, FP&amp;A, or revenue operations functions</li>
<li>Strong understanding of financial statements (P&amp;L, balance sheet, cash flow)</li>
<li>Knowledge of key financial metrics: ARR, MRR, bookings, revenue recognition, CAC, LTV, gross margin, EBITDA</li>
<li>Experience with financial planning processes: budgeting, forecasting, variance analysis, scenario modeling</li>
<li>Understanding of SaaS/subscription business models and revenue recognition principles (ASC 606 preferred)</li>
</ul>
<p><strong>Analytical &amp; Problem-Solving</strong></p>
<ul>
<li>Proven ability to work with large, complex datasets and derive meaningful insights</li>
<li>Experience building financial models and dashboards that drive executive decision-making</li>
<li>Strong statistical analysis skills and understanding of data visualization best practices</li>
<li>Track record of translating ambiguous business problems into structured analytical frameworks</li>
</ul>
<p><strong>Preferred Experience</strong></p>
<ul>
<li>Background in fintech, payments, B2B SaaS, or high-growth technology companies</li>
<li>Experience supporting GTM analytics (sales forecasting, pipeline analysis, quota setting)</li>
<li>Familiarity with finance systems: NetSuite, Anaplan, Adaptive Planning, Salesforce, Stripe Billing</li>
<li>Exposure to data science methodologies and machine learning concepts</li>
<li>Previous work in cross-functional environments collaborating with finance, data science, and business teams</li>
</ul>
<p><strong>Key Competencies</strong></p>
<ul>
<li>Business Acumen: Ability to understand complex business models and translate them into data requirements</li>
<li>Technical Excellence: Deep technical skills with commitment to code quality and best practices</li>
<li>Communication: Exceptional ability to explain technical concepts to non-technical stakeholders</li>
<li>Stakeholder Management: Experience partnering with senior leaders and influencing through data</li>
<li>Ownership Mindset: Self-directed with ability to manage multiple priorities and drive projects to completion</li>
<li>Continuous Learning: Curiosity to learn new tools, techniques, and business domains</li>
<li>Attention to Detail: Commitment to data accuracy and quality in high-stakes financial reporting</li>
</ul>
<p><strong>Education</strong></p>
<ul>
<li>Bachelor&#39;s degree in Finance, Economics, Statistics, Mathematics, Computer Science, Engineering, or related quantitative field</li>
<li>Advanced degree (MBA, MS in Analytics/Data Science) or relevant certifications (CFA, CPA, data analytics certifications) a plus</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Tableau, Power BI, Looker, Python, R, Cloud data platforms, ETL tools, Version control, Machine learning, Data science, Finance systems, Data visualization</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>Stripe</Employername>
      <Employerlogo>https://logos.yubhub.co/stripe.com.png</Employerlogo>
      <Employerdescription>Stripe is a financial infrastructure platform for businesses. It provides payment processing services to millions of companies worldwide.</Employerdescription>
      <Employerwebsite>https://stripe.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/stripe/jobs/7597624</Applyto>
      <Location>Bengaluru</Location>
      <Country></Country>
      <Postedate>2026-03-31</Postedate>
    </job>
    <job>
      <externalid>56dc9a51-e66</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, bringing your ideas, experience, diverse thinking, and a strong mindset.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will act as a senior technical leader in complex data and analytics engagements. You will shape and govern end-to-end enterprise data architectures, lead technical teams, and serve as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:
<ul>
<li>Data mesh, data fabric, data lake, and data warehouse architectures</li>
<li>Modern data architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data platform, DevOps, deployment, and security architectures</li>
<li>Analytics and AI enablement architectures</li>
</ul>
</li>
<li>Hands-on experience with cloud data platforms, e.g.:
<ul>
<li>Azure, AWS, or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
</ul>
</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation, etc.</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation; a brief illustrative sketch follows this list</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
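<p>As one hedged example of the orchestration experience listed above, here is a minimal Airflow-style DAG with an extract-then-transform dependency. Airflow is only one of many orchestrators, the DAG id, schedule, and task bodies are invented for illustration, and the exact import paths and parameters vary across Airflow versions.</p>
<pre><code># Illustrative sketch only: dag_id, schedule, and task logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source system")

def transform():
    print("apply mapping and transformation rules")

with DAG(
    dag_id="daily_integration",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run extract before transform.
    extract_task >> transform_task
</code></pre>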
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:
<ul>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data management software and tools</li>
<li>Security, access control, and compliance considerations</li>
</ul>
</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>You will work with the most innovative technological solutions in the modern data ecosystem. In this role you’ll see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes for our client innovations, our cultural diversity, and our dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine included us on its list of Best Firms to Work For. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>enterprise data architecture, system data integration, data engineering, analytics, modern data architectures, Data Mesh / Data Fabric / data lake / data warehouse architectures, modern data architecture design principles, batch and streaming data integration patterns, data platform, DevOps, deployment and security architectures, analytics and AI enablement architectures, cloud data platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, SQL, relational databases, Postgres, SQL Server, Oracle, NoSQL databases, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, data migration programmes, data pipelines, orchestration, automation, CI/CD concepts, production-grade deployments, distributed systems, Docker, Kubernetes, data management and governance principles, data quality, metadata, lineage, master data management, data management software and tools, security, access control, compliance considerations, advanced analytics, AI / ML or GenAI, streaming platforms, Kafka, Azure Event Hubs, data governance or metadata tools, cloud, data, architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. It is a mid-size player with a supportive, entrepreneurial spirit.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/hpBWjvvy8D6B1f818cHxZR/remote-principal-consultant---data-architecture-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
  </jobs>
</source>