{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/amazon-rds"},"x-facet":{"type":"skill","slug":"amazon-rds","display":"Amazon Rds","count":2},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9c40b25b-28b"},"title":"FBS Senior Data Engineer (Airflow)","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>We don&#39;t have a local legal entity, so we&#39;ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>\n<p>You can expect a solid and innovative company with a strong market presence, a dynamic, diverse, and multicultural work environment, leaders with deep market knowledge and strategic vision, and continuous learning and development.</p>\n<p>The new data platforms team will be our centralized shared services team supporting all data platforms such as Snowflake and Astronomer. They will be responsible for the strategy and implementation of these platforms as well as best practices for the business units to follow. 
In this case, the position is focused on Astronomer/Airflow.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Build and maintain automated data workflows and orchestrations using Apache Airflow</li>\n<li>Implement at least two major end-to-end data pipeline projects using Airflow</li>\n<li>Design and optimize complex DAGs for scalability, maintainability, and reliability</li>\n<li>Create reusable, parameterized, and modular Airflow components (operators, sensors, hooks) to streamline workflow development</li>\n<li>Ensure effective monitoring, alerting, and logging of Airflow DAGs for quick issue resolution</li>\n<li>Document workflows, solutions, and processes for team knowledge sharing and training</li>\n<li>Mentor and support other team members in Airflow usage and adoption</li>\n<li>Explain best practices, identify pros and cons, and communicate technical decisions to team members</li>\n<li>Develop reusable frameworks, leveraging reusable concepts for efficiency and scalability</li>\n<li>Implement and utilize reusable ecosystem components, including Python &amp; Apache Airflow, DynamoDB, Amazon RDS</li>\n<li>Develop reusable frameworks to enforce data governance and data quality standards</li>\n<li>CI/CD pipeline development using reusable frameworks and Jenkins</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Between 4-6 years of experience in a similar role</li>\n<li>Bachelor&#39;s degree in IT, Information Systems, Computer Science, or a related field</li>\n<li>Insurance Experience (Desirable)</li>\n<li>Fluency in English</li>\n<li>Availability to work according to CST or PST time zones.</li>\n</ul>\n<p><strong>Technical Skills</strong></p>\n<ul>\n<li>Airflow (MUST) / Astronomer (PLUS) - Advanced (5 Years)</li>\n<li>Python - Advanced (4-6 Years) (MUST)</li>\n<li>Snowflake - Intermediate (MUST)</li>\n<li>DBT - Entry Level (PLUS)</li>\n<li>AWS Glue - Entry Level (PLUS)</li>\n<li>DynamoDB - Intermediate</li>\n<li>Amazon RDS - 
Intermediate</li>\n<li>Jenkins - Intermediate</li>\n</ul>\n<p><strong>Other Critical Skills</strong></p>\n<ul>\n<li>Work Independently</li>\n<li>Strategic Thinking</li>\n<li>Guide Others</li>\n<li>Documentation</li>\n<li>Explain best practices</li>\n<li>Communicate Technical Decisions</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9c40b25b-28b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/mzMboyMoUyGryzfFUD6uuZ/remote-fbs-senior-data-engineer-(airflow)-in-mexico-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Airflow","Python","Snowflake","DBT","AWS Glue","DynamoDB","Amazon RDS","Jenkins"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:59:05.587Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Airflow, Python, Snowflake, DBT, AWS Glue, DynamoDB, Amazon RDS, Jenkins"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2e1270db-bb7"},"title":"FBS Senior Data Engineer 
(Airflow)","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>We are looking for a Senior Data Engineer to join our new data platforms team, which will be our centralized shared services team supporting all data platforms such as Snowflake and Astronomer. The position is focused on Astronomer/Ariflow.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Build and maintain automated data workflows and orchestrations using Apache Airflow</li>\n<li>Implement at least two major end-to-end data pipeline projects using Airflow</li>\n<li>Design and optimize complex DAGs for scalability, maintainability, and reliability</li>\n<li>Create reusable, parameterized, and modular Airflow components (operators, sensors, hooks) to streamline workflow development</li>\n<li>Ensure effective monitoring, alerting, and logging of Airflow DAGs for quick issue resolution</li>\n<li>Document workflows, solutions, and processes for team knowledge sharing and training</li>\n<li>Mentor and support other team members in Airflow usage and adoption</li>\n<li>Explain best practices, identify pros and cons, and communicate technical decisions to team members</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Between 4-6 years of experience in a similar role</li>\n<li>Bachelor&#39;s degree in IT, Information systems, Computer Science or a related field</li>\n<li>Insurance Experience (Desirable)</li>\n<li>Fluency in English</li>\n<li>Availability to work according to CST or PST time zones.</li>\n</ul>\n<p><strong>Technical Skills</strong></p>\n<ul>\n<li>Airflow (MUST) / Astronomer (PLUS) - Advanced (5 
Years)</li>\n<li>Python - Advanced (4-6 Years) (MUST)</li>\n<li>Snowflake - Intermediate (MUST)</li>\n<li>DBT - Entry Level (PLUS)</li>\n<li>AWS Glue - Entry Level (PLUS)</li>\n<li>DynamoDB - Intermediate</li>\n<li>Amazon RDS - Intermediate</li>\n<li>Jenkins - Intermediate</li>\n</ul>\n<p><strong>Other Critical Skills</strong></p>\n<ul>\n<li>Work Independently</li>\n<li>Strategic Thinking</li>\n<li>Guide Others</li>\n<li>Documentation</li>\n<li>Explain best practices</li>\n<li>Communicate Technical Decisions</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2e1270db-bb7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/g6Kk9PeaSt9vgqEz7dTtY5/remote-fbs-senior-data-engineer-(airflow)-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Airflow","Python","Snowflake","DBT","AWS Glue","DynamoDB","Amazon RDS","Jenkins"],"x-skills-preferred":["Insurance Experience","Fluency in 
English"],"datePosted":"2026-03-09T16:54:51.948Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Airflow, Python, Snowflake, DBT, AWS Glue, DynamoDB, Amazon RDS, Jenkins, Insurance Experience, Fluency in English"}]}