<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>65e7bd92-c31</externalid>
      <Title>FBS Analytics Engineer</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>
<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Team Function</strong></p>
<p>The Direct modeling team focuses on creating models that guide enterprise marketing decisions, helping to promote brand awareness and boost sales through the direct channel.</p>
<p><strong>Role Description:</strong></p>
<p>This position plays a crucial role in the data ecosystem by iteratively transforming raw data into structured, high-quality datasets that are ready for analysis in partnership with data/decision scientists. The role primarily focuses on moderately complex business problems while receiving limited coaching and guidance from data leadership. The role combines the technical skills of a data engineer, the analytical mindset of a data analyst, and strong business acumen to ensure data is not only collected and stored efficiently but also made accessible and insightful for end users. In partnership with data/decision scientists, the position is responsible for end-to-end data workflow including data ingestion, transformation, modeling, and validation to enable data-driven decision-making across the organization. This position requires deep understanding of data engineering, business processes, and analytics principles as well as a proactive approach to solving complex data challenges.</p>
<p><strong>Essential Job Functions:</strong></p>
<p><strong>1) Data infrastructure development:</strong> Pipeline Design and Development - Architects and builds scalable data pipelines using modern ELT (Extract, Load, Transform) tools and frameworks such as dbt (Data Build Tool), Apache Airflow, or similar. Automates data ingestion processes from various sources including databases, APIs, and third-party services. Data Storage and Management - Designs and implements data warehousing solutions using platforms like Snowflake, Redshift, or BigQuery. Optimizes storage solutions for performance, cost efficiency, and scalability.</p>
<p><strong>2) Data modeling and transformation:</strong> Data Modeling - Develops and maintains logical and physical data models to support business analytics. Creates and manages dimensional models, star/snowflake schemas, and other data structures. Data Transformation - Transforms raw data into clean, organized, and analytics-ready datasets using SQL, Python, or other relevant languages. Implements data transformation workflows to handle data cleansing, normalization, and enrichment. Data Quality Assurance - Conducts data validation and consistency checks to ensure the accuracy and reliability of data. Implements data quality monitoring and alerting mechanisms.</p>
<p><strong>3) Collaboration and stakeholder management:</strong> Cross-Functional Collaboration - Works closely with data analysts, data scientists, and business stakeholders to gather requirements and understand their data needs. Acts as a liaison between technical teams and business units to translate business requirements into technical specifications. Technical Communication - Clearly communicates complex technical concepts and data insights to non-technical stakeholders. Provides training and support to team members on data tools, best practices, and methodologies.</p>
<p><strong>Requirements</strong></p>
<ul>
<li>Over 4 years of experience in data development and analytics engineering using Python, SQL, dbt, and Snowflake.</li>
<li>Bachelor’s degree in Computer Science, Data Science, Engineering, or another Math- or Technology-related field.</li>
<li>Fluency in English</li>
</ul>
<p><strong>Software / Tools</strong></p>
<ul>
<li>SQL (must have)</li>
<li>Python (must have)</li>
<li>Snowflake (must have)</li>
<li>dbt (must have)</li>
</ul>
<p><strong>Other Critical Skills</strong></p>
<ul>
<li>Data Transformation</li>
<li>Data Quality Assurance</li>
<li>Pipeline Design and Development</li>
<li>Technical Communication</li>
<li>Independent work</li>
<li>Attention to detail</li>
</ul>
<p><strong>Benefits</strong></p>
<p>This position comes with a competitive compensation and benefits package.</p>
<ul>
<li>A competitive salary and performance-based bonuses.</li>
<li>Comprehensive benefits package.</li>
<li>Flexible work arrangements (remote and/or office-based).</li>
<li>A dynamic and inclusive work culture within a globally renowned group.</li>
<li>Private Health Insurance.</li>
<li>Paid Time Off.</li>
<li>Training &amp; Development opportunities in partnership with renowned companies.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Python, Snowflake, DBT, Data Transformation, Data Quality Assurance, Pipeline Design and Development, Technical Communication, Independent work, Attention to detail</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company that provides a range of services including technology consulting, application services, and business process outsourcing.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/ws76jLTZQ1JKbCcs3CUiC4/remote-fbs-analytics-engineer-in-brazil-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>5aabf454-ae0</externalid>
      <Title>FBS Analytics Engineer</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>
<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Team Function</strong></p>
<p>The Direct modeling team focuses on creating models that guide enterprise marketing decisions, helping to promote brand awareness and boost sales through the direct channel.</p>
<p><strong>Role Description:</strong></p>
<p>This position plays a crucial role in the data ecosystem by iteratively transforming raw data into structured, high-quality datasets that are ready for analysis in partnership with data/decision scientists. The role primarily focuses on moderately complex business problems while receiving limited coaching and guidance from data leadership. The role combines the technical skills of a data engineer, the analytical mindset of a data analyst, and strong business acumen to ensure data is not only collected and stored efficiently but also made accessible and insightful for end users. In partnership with data/decision scientists, the position is responsible for end-to-end data workflow including data ingestion, transformation, modeling, and validation to enable data-driven decision-making across the organization. This position requires deep understanding of data engineering, business processes, and analytics principles as well as a proactive approach to solving complex data challenges.</p>
<p><strong>Essential Job Functions:</strong></p>
<p><strong>1) Data infrastructure development:</strong> Pipeline Design and Development - Architects and builds scalable data pipelines using modern ELT (Extract, Load, Transform) tools and frameworks such as dbt (Data Build Tool), Apache Airflow, or similar. Automates data ingestion processes from various sources including databases, APIs, and third-party services. Data Storage and Management - Designs and implements data warehousing solutions using platforms like Snowflake, Redshift, or BigQuery. Optimizes storage solutions for performance, cost efficiency, and scalability.</p>
<p><strong>2) Data modeling and transformation:</strong> Data Modeling - Develops and maintains logical and physical data models to support business analytics. Creates and manages dimensional models, star/snowflake schemas, and other data structures. Data Transformation - Transforms raw data into clean, organized, and analytics-ready datasets using SQL, Python, or other relevant languages. Implements data transformation workflows to handle data cleansing, normalization, and enrichment. Data Quality Assurance - Conducts data validation and consistency checks to ensure the accuracy and reliability of data. Implements data quality monitoring and alerting mechanisms.</p>
<p><strong>3) Collaboration and stakeholder management:</strong> Cross-Functional Collaboration - Works closely with data analysts, data scientists, and business stakeholders to gather requirements and understand their data needs. Acts as a liaison between technical teams and business units to translate business requirements into technical specifications. Technical Communication - Clearly communicates complex technical concepts and data insights to non-technical stakeholders. Provides training and support to team members on data tools, best practices, and methodologies.</p>
<p><strong>Requirements</strong></p>
<ul>
<li>Over 4 years of experience in data development and analytics engineering using Python, SQL, dbt, and Snowflake.</li>
<li>Bachelor’s degree in Computer Science, Data Science, Engineering, or another Math- or Technology-related field.</li>
<li>Fluency in English</li>
</ul>
<p><strong>Software / Tools</strong></p>
<ul>
<li>SQL (must have)</li>
<li>Python (must have)</li>
<li>Snowflake (must have)</li>
<li>dbt (must have)</li>
</ul>
<p><strong>Other Critical Skills</strong></p>
<ul>
<li>Data Transformation</li>
<li>Data Quality Assurance</li>
<li>Pipeline Design and Development</li>
<li>Technical Communication</li>
<li>Independent work</li>
<li>Attention to detail</li>
</ul>
<p><strong>Benefits</strong></p>
<p>This position comes with a competitive compensation and benefits package.</p>
<ul>
<li>A competitive salary and performance-based bonuses.</li>
<li>Comprehensive benefits package.</li>
<li>Flexible work arrangements (remote and/or office-based).</li>
<li>A dynamic and inclusive work culture within a globally renowned group.</li>
<li>Private Health Insurance.</li>
<li>Paid Time Off.</li>
<li>Training &amp; Development opportunities in partnership with renowned companies.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Python, Snowflake, DBT, Data Transformation, Data Quality Assurance, Pipeline Design and Development, Technical Communication, Independent work, Attention to detail</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with a diverse collective of nearly 350,000 strategic and technological experts across more than 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/htNwC3gPnBQ9oxedafiBav/remote-fbs-analytics-engineer-in-mexico-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
  </jobs>
</source>