{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/glue"},"x-facet":{"type":"skill","slug":"glue","display":"Glue","count":8},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ad717304-da7"},"title":"Intern Data Analytics (all genders)","description":"<p>You will be part of the Business Intelligence department, which consists of the Data Science, Data Analytics, and Data Engineering teams.</p>\n<p>This internship provides a great opportunity to gain hands-on experience into Data Analytics. You will work alongside a team of highly skilled and dedicated professionals who are committed to offering strong mentorship and guidance to help you start your career in the field of data.</p>\n<p>Duration: 6 months. Location: Munich, 2-3 office days per week.</p>\n<p><strong>Our Tech Stack</strong></p>\n<ul>\n<li>Database: AWS Stack (Redshift, Athena, Glue, S3).</li>\n<li>Data Pipelines: Airflow, DBT.</li>\n<li>Data Visualization: Looker.</li>\n<li>Data Analytics: SQL, Python.</li>\n<li>Collaboration: Git, Atlassian.</li>\n</ul>\n<p><strong>Your role in this journey</strong></p>\n<p>As a Data Analytics Intern at Holidu, you’ll help our company make smarter, data-driven decisions, while being supported by a Senior Analyst.</p>\n<p>This role goes beyond building dashboards. We want curious, proactive people who want to become data advisors - not only delivering reports, but understanding the business context, which questions they answer and why they matter.</p>\n<ul>\n<li>Collect, analyse, and interpret large datasets to help solve real business challenges.</li>\n<li>Build dashboards and reports using tools like SQL, Python, and Looker.</li>\n<li>Collaborate closely with teams such as Product, Marketing, or Finance to help them extract actionable insights from data.</li>\n<li>Build and improve data pipelines using cutting-edge technologies.</li>\n<li>We are an AI-first team. 
Rather than manually executing repetitive tasks, you will use AI to work smarter and automate workflows.</li>\n<li>You’ll collaborate with our Data Scientists and get exposure to:</li>\n<li>Data preparation and exploratory data analysis.</li>\n<li>How ML models are built, evaluated, and deployed in real life.</li>\n</ul>\n<p><strong>Your backpack is filled with</strong></p>\n<ul>\n<li>Currently enrolled in or recently completed a Bachelor’s or Master’s degree in a quantitative field (e.g., Business Analytics, Data Science, Economics, Statistics, Mathematics, Engineering, or similar).</li>\n<li>Understanding of SQL and Python, proficiency in Excel/Google Sheets, and a desire to learn visualization tools like Looker.</li>\n<li>Knowledge of Machine Learning and Statistical models is a plus.</li>\n<li>Strong analytical and problem-solving skills, and attention to detail.</li>\n<li>Curiosity to learn and a passion for solving data problems.</li>\n<li>Good communication and presentation skills.</li>\n</ul>\n<p><strong>Our adventure includes</strong></p>\n<ul>\n<li>Compensation: Get a fair salary.</li>\n<li>Impact: Make a difference for hundreds of thousands of monthly users.</li>\n<li>Growth: Take responsibility from day one and develop through regular feedback.</li>\n<li>Community: Engage with international, diverse, yet like-minded colleagues through regular events and 2 office days per week with your team.</li>\n<li>Flexibility: Benefit from our hybrid work policy and the chance to work from other local offices for up to 8 weeks a year.</li>\n<li>Fitness: Get an Urban Sports Club corporate subscription or a premium gym membership at a discounted rate.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ad717304-da7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Holidu Hosts GmbH","sameAs":"https://holidu.jobs.personio.com","logo":"https://logos.yubhub.co/holidu.jobs.personio.com.png"},"x-apply-url":"https://holidu.jobs.personio.com/job/2556233","x-work-arrangement":"hybrid","x-experience-level":"intern","x-job-type":"Internship","x-salary-range":null,"x-skills-required":["SQL","Python","Looker","Git","Atlassian","Airflow","DBT","AWS Stack","Redshift","Athena","Glue","S3"],"x-skills-preferred":[],"datePosted":"2026-04-18T22:13:45.423Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Munich, Germany"}},"employmentType":"INTERN","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, Looker, Git, Atlassian, Airflow, DBT, AWS Stack, Redshift, Athena, Glue, S3"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6690b2fa-cab"},"title":"(Senior) Team Lead Data Analytics (all genders)","description":"<p>At Holidu, data isn&#39;t just a support function; it&#39;s how we make decisions. The Analytics team builds the products and foundations that keep the whole organisation sharp, from day-to-day operations to long-term strategy.</p>\n<p>This role is based in Munich, with two office days per week.</p>\n<p>As a Senior Team Lead Data Analytics, you will lead one of Holidu&#39;s core analytics teams, a function at the intersection of data, strategy, and real business impact.
The team has four direct reports, and the role entails collaborating cross-functionally with data engineers and data scientists.</p>\n<p>Engage with senior leadership on strategic projects, providing insights that influence product strategy, internal operations, and revenue growth.</p>\n<p>You and your team will support a range of stakeholders across the company (e.g. Customer Support, Host Experience, Sales and Account Management).</p>\n<p>As a member of the BI leadership team, you will help shape the department strategy and the future of AI-powered data products.</p>\n<p>Understand problems and identify opportunities across a diverse range of stakeholder use cases, translating them into analytical requirements and communicating complex findings clearly to both technical and commercial audiences.</p>\n<p>Lead from the front: this role carries meaningful individual contributor responsibility. You&#39;ll be expected to do real analytical work, diving deep into the data, building solutions, and setting the bar for quality in your team.</p>\n<p>Shape the future of analytics at Holidu by recruiting top talent, setting clear goals, and developing your team personally and professionally.</p>\n<p>The ideal candidate will have 5+ years of data analytics experience, people management experience, a collaborative mindset, a mission-driven mentality, excellent analytical and technical skills, and a genuine commitment to AI enablement.</p>\n<p>Impact: Shape the future of travel with products used by millions of guests and thousands of hosts. At Holidu, ideas become products, data drives decisions, and iteration fuels fast learning. Your work matters - and you’ll see the impact.</p>\n<p>Learning: Grow professionally in a culture that thrives on curiosity and feedback. You’ll learn from outstanding colleagues, collaborate across disciplines, and benefit from mentorship and personal learning budgets - with a strong focus on AI.</p>\n<p>Great People: Join a team of smart, motivated and international colleagues who challenge and support each other. We celebrate wins and keep our culture fun, ambitious and human. Our customers are guests and hosts - people we can all relate to - making work meaningful and energizing.</p>\n<p>Technology: Work in a modern tech environment. You’ll experience the pace of a scale-up combined with the stability of a proven business model, enabling you to build, test, and improve continuously.</p>\n<p>Flexibility: Work in a hybrid setup with 50% in-office time for collaboration, and spend up to 8 weeks a year from other inspiring locations.
You’ll stay connected through regular events and meet-ups across our almost 30 offices.</p>\n<p>Perks on Top: Of course, we also offer travel benefits, gym discounts, and other perks to keep you energized - but what truly sets us apart is the chance to grow in a dynamic industry, alongside amazing people, while having fun along the way.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6690b2fa-cab","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Holidu Hosts GmbH","sameAs":"https://holidu.jobs.personio.com","logo":"https://logos.yubhub.co/holidu.jobs.personio.com.png"},"x-apply-url":"https://holidu.jobs.personio.com/job/2598226","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"Full-time","x-salary-range":null,"x-skills-required":["Database: AWS Stack (Redshift, Athena, Glue, S3)","Data Pipelines: Airflow, dbt","Data Visualisation: Looker","Data Analytics: SQL, Python","Collaboration: Git, Jira, Confluence, Slack"],"x-skills-preferred":[],"datePosted":"2026-04-18T22:13:28.264Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Munich, Germany"}},"employmentType":"FULL_TIME","occupationalCategory":"Technology","industry":"Travel Technology","skills":"Database: AWS Stack (Redshift, Athena, Glue, S3), Data Pipelines: Airflow, dbt, Data Visualisation: Looker, Data Analytics: SQL, Python, Collaboration: Git, Jira, Confluence, Slack"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ba30b234-c68"},"title":"Senior Data Engineer, Payments","description":"<p>We&#39;re looking for a Senior Data Engineer to join our Payments team. As a critical part of our operations, you&#39;ll handle data related to compliance with Tax, Payments, and Legal regulations. You&#39;ll design, build, and maintain robust and efficient data pipelines that collect, process, and store data from various sources, including user interactions, listing details, and external data feeds.</p>\n<p>Your work will involve developing data models that enable the efficient analysis and manipulation of data for merchandising optimization, ensuring data quality, consistency, and accuracy. You&#39;ll also develop high-quality data assets for product use cases by partnering with Product, AI/ML, and Data Science teams.</p>\n<p>As a Senior Data Engineer, you&#39;ll contribute to creating standards and best practices for Airbnb&#39;s Data Engineering and shape the tools, processes, and standards used by the broader data community. You&#39;ll collaborate with cross-functional teams to define data requirements and deliver data solutions that drive merchandising and sales improvements.</p>\n<p>To succeed in this role, you&#39;ll need 6+ years of relevant industry experience, a BE/B.Tech in Computer Science or a relevant technical degree, and hands-on experience in DSA coding, data structures, and algorithms.
You&#39;ll also need extensive experience designing, building, and operating robust distributed data platforms and handling data at the petabyte scale.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ba30b234-c68","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Airbnb","sameAs":"https://www.airbnb.com/","logo":"https://logos.yubhub.co/airbnb.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/airbnb/jobs/7256787","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Scala","Python","data processing technologies","query authoring (SQL)","ETL schedulers (Apache Airflow, Luigi, Oozie, AWS Glue)","data warehousing concepts","relational databases (PostgreSQL, MySQL)","columnar databases (Redshift, BigQuery, HBase, ClickHouse)"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:52:13.348Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bangalore, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Scala, Python, data processing technologies, query authoring (SQL), ETL schedulers (Apache Airflow, Luigi, Oozie, AWS Glue), data warehousing concepts, relational databases (PostgreSQL, MySQL), columnar databases (Redshift, BigQuery, HBase, ClickHouse)"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2a2686d2-290"},"title":"Staff Analytics Engineer","description":"<p>At Twilio, we&#39;re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences.</p>\n<p>Our Data Science and Analytics team seeks to empower R&amp;D to make data-backed decisions that accelerate innovation and improve product performance. You will work closely within our team and across Product &amp; Engineering to design and maintain a robust analytics data layer that enables trusted reporting on R&amp;D metrics.</p>\n<p>In this role, you&#39;ll:</p>\n<ul>\n<li>Design and implement a formal analytics data layer using AWS Glue, Presto, and LookML</li>\n<li>Collaborate within the Data Science &amp; Analytics team and across Product &amp; Engineering to define, document, and maintain alignment on metric definition and data lineage</li>\n<li>Develop and maintain automated data reconciliation and quality checks to proactively identify and resolve discrepancies, ensuring accuracy and consistency of critical reports and dashboards</li>\n<li>Lead investigations into complex data anomalies, conduct root cause analysis, and communicate findings and solutions effectively to both technical and non-technical audiences</li>\n<li>Mentor and guide members of the data science and analytics team, establishing and enforcing best practices around data modeling, testing, documentation, and code review</li>\n</ul>\n<p>Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply.</p>\n<p>If your career is just starting or hasn&#39;t followed a traditional path, don&#39;t let that stop you from considering Twilio. 
We are always looking for people who will bring something new to the table!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2a2686d2-290","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Twilio","sameAs":"https://www.twilio.com/","logo":"https://logos.yubhub.co/twilio.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/twilio/jobs/7551660","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$155,520 - $194,400 (Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, Vermont or Washington D.C.)\n$164,640 - $205,800 (New York, New Jersey, Washington State, or California (outside of the San Francisco Bay area))\n$182,960 - $228,700 (San Francisco Bay area, California)","x-skills-required":["AWS Glue","Presto","LookML","SQL","data modeling","data pipelines","data reconciliation","data quality checks"],"x-skills-preferred":["Python","distributed computing technologies","Hive","Spark","dashboarding tools","Looker","Tableau"],"datePosted":"2026-04-18T15:43:20.940Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"AWS Glue, Presto, LookML, SQL, data modeling, data pipelines, data reconciliation, data quality checks, Python, distributed computing technologies, Hive, Spark, dashboarding tools, Looker, Tableau","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":155520,"maxValue":228700,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_11099543-51f"},"title":"Software Engineer L3 Phone Numbers","description":"<p>At Twilio, we&#39;re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences.</p>\n<p>Join the team as Twilio&#39;s next Software Engineer L3. This is a Senior Software Engineer position on Twilio&#39;s Messaging Compliance Onboarding team. Programmable Messaging is Twilio&#39;s biggest product. To keep pace with the evolving messaging compliance ecosystem, we need strong engineers who can create innovative solutions to ensure compliance with Twilio partners.</p>\n<p>In this role, you&#39;ll build and maintain multiple compliance program workflows, carrier/ecosystem integrations, and customer interactions in the Compliance domain. You will design and develop elegant and scalable solutions across a wide variety of compliance program types, including frontend UI experiences and backend APIs that are highly available and responsive.</p>\n<p>You will work through ambiguity and deliver quickly with high quality. Build towards the next generation of the architecture vision that empowers expansion of Compliance programs. Interact cross-functionally across engineering teams within Twilio to align on and build the product and architecture vision.</p>\n<p>Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply.
If your career is just starting or hasn&#39;t followed a traditional path, don&#39;t let that stop you from considering Twilio.</p>\n<p>We are always looking for people who will bring something new to the table!</p>\n<p>Required:</p>\n<ul>\n<li>5+ years of experience and a strong fundamental knowledge of software development using JVM languages.</li>\n<li>Experience building web services incorporating best practices for external systems integration, including defensive and hardened approaches to mitigate downstream issues.</li>\n<li>Experience working with highly scalable APIs, high volume data pipelines and large distributed systems.</li>\n<li>Experience maintaining and operating cloud services.</li>\n<li>An unwillingness to settle for &#39;good enough&#39;, instead staying focused on longevity through well-tested code and continuous improvement.</li>\n<li>Demonstrated commitment to seeking diverse viewpoints and acting with intention to create an inclusive team environment.</li>\n<li>Excellent written and verbal communication skills. Ability to write down and present designs and decisions throughout the development lifecycle, collaborating with engineering and non-engineering roles.</li>\n</ul>\n<p>Desired:</p>\n<ul>\n<li>5+ years of Engineering experience, developing and maintaining high traffic services.</li>\n<li>Familiarity with DynamoDB, SQS, and data integration services like AWS Glue</li>\n<li>Familiarity with LLMs, prompt optimizations to improve model accuracy, and setting up evaluation pipelines</li>\n<li>Familiarity with Kubernetes, Temporal, or similar workflow orchestration</li>\n<li>Experience working with frontend libraries like React or similar.</li>\n</ul>\n<p>Compensation:</p>\n<p>Please note this role is open to candidates outside of California, Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, New Jersey, New York, Vermont, Washington D.C., and Washington State.</p>\n<p>The estimated pay ranges for this role are as follows:</p>\n<ul>\n<li>Based in Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, Vermont or Washington D.C.: $138,700 - $173,400</li>\n<li>Based in New York, New Jersey, Washington State, or California (outside of the San Francisco Bay area): $146,800 - $183,600</li>\n<li>Based in the San Francisco Bay area, California: $163,100 - $203,900.</li>\n</ul>\n<p>This role may be eligible to participate in Twilio&#39;s equity plan and corporate bonus plan. All roles are generally eligible for the following benefits: health care insurance, 401(k) retirement account, paid sick time, paid personal time off, paid parental leave.</p>\n<p>Applications for this role are intended to be accepted until 4/10/2026, but may change based on business needs.</p>\n<p>Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That&#39;s why we seek out colleagues who embody our values, something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you&#39;re ready to unleash your full potential, do your best work, and be the best version of yourself, apply now!</p>\n<p>If this role isn&#39;t what you&#39;re looking for, please consider other open positions.</p>\n<p>Twilio is proud to be an equal opportunity employer.
We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_11099543-51f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Twilio","sameAs":"https://www.twilio.com/","logo":"https://logos.yubhub.co/twilio.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/twilio/jobs/7724877","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["software development using JVM languages","web services","external systems integration","highly scalable APIs","high volume data pipelines","large distributed systems","cloud services","well-tested code","continuous improvement","inclusive team environment","written and verbal communication skills"],"x-skills-preferred":["DynamoDB","SQS","AWS Glue","LLMs","prompt optimizations","evaluation pipelines","Kubernetes","Temporal","workflow orchestration","React"],"datePosted":"2026-04-18T15:41:58.222Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote - US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"software development using JVM languages, web services, external systems integration, highly scalable APIs, high volume data pipelines, large distributed systems, cloud services, well-tested code, continuous improvement, inclusive team environment, written and verbal communication skills, DynamoDB, SQS, AWS Glue, LLMs, prompt optimizations, evaluation pipelines, Kubernetes, Temporal, workflow orchestration, React"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9c40b25b-28b"},"title":"FBS Senior Data Engineer (Airflow)","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>We don&#39;t have a local legal entity, so we&#39;ve partnered with Capgemini, which acts as the Employer of Record.
Capgemini is responsible for managing local payroll and benefits.</p>\n<p>You can expect a solid and innovative company with a strong market presence, a dynamic, diverse, and multicultural work environment, leaders with deep market knowledge and strategic vision, and continuous learning and development.</p>\n<p>The new data platforms team will be our centralized shared services team supporting all data platforms such as Snowflake and Astronomer. The team will be responsible for the strategy and implementation of these platforms, as well as best practices for the business units to follow. In this case, the position is focused on Astronomer/Airflow.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Build and maintain automated data workflows and orchestrations using Apache Airflow</li>\n<li>Implement at least two major end-to-end data pipeline projects using Airflow</li>\n<li>Design and optimize complex DAGs for scalability, maintainability, and reliability</li>\n<li>Create reusable, parameterized, and modular Airflow components (operators, sensors, hooks) to streamline workflow development</li>\n<li>Ensure effective monitoring, alerting, and logging of Airflow DAGs for quick issue resolution</li>\n<li>Document workflows, solutions, and processes for team knowledge sharing and training</li>\n<li>Mentor and support other team members in Airflow usage and adoption</li>\n<li>Explain best practices, identify pros and cons, and communicate technical decisions to team members</li>\n<li>Develop reusable frameworks, leveraging reusable concepts for efficiency and scalability</li>\n<li>Implement and utilize reusable ecosystem components, including Python &amp; Apache Airflow, DynamoDB, Amazon RDS</li>\n<li>Develop reusable frameworks to enforce data governance and data quality standards</li>\n<li>CI/CD pipeline development using reusable frameworks and Jenkins</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>4-6 years of experience in a similar role</li>\n<li>Bachelor&#39;s degree in IT, Information systems, Computer Science or a related field</li>\n<li>Insurance Experience (Desirable)</li>\n<li>Fluency in English</li>\n<li>Availability to work according to CST or PST time zones.</li>\n</ul>\n<p><strong>Technical Skills</strong></p>\n<ul>\n<li>Airflow (MUST) / Astronomer (PLUS) - Advanced (5 Years)</li>\n<li>Python - Advanced (4-6 Years) (MUST)</li>\n<li>Snowflake – Intermediate (MUST)</li>\n<li>DBT - Entry Level (PLUS)</li>\n<li>AWS Glue - Entry Level (PLUS)</li>\n<li>DynamoDB - Intermediate</li>\n<li>Amazon RDS - Intermediate</li>\n<li>Jenkins - Intermediate</li>\n</ul>\n<p><strong>Other Critical Skills</strong></p>\n<ul>\n<li>Work Independently</li>\n<li>Strategic Thinking</li>\n<li>Guide Others</li>\n<li>Documentation</li>\n<li>Explain best practices</li>\n<li>Communicate Technical Decisions</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9c40b25b-28b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/mzMboyMoUyGryzfFUD6uuZ/remote-fbs-senior-data-engineer-(airflow)-in-mexico-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Airflow","Python","Snowflake","DBT","AWS Glue","DynamoDB","Amazon RDS","Jenkins"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:59:05.587Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Airflow, Python, Snowflake, DBT, AWS Glue, DynamoDB, Amazon RDS, Jenkins"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2e1270db-bb7"},"title":"FBS Senior Data Engineer (Airflow)","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>\n<p>We are looking for a Senior Data Engineer to join our new data platforms team, which will be our centralized shared services team supporting all data platforms such as Snowflake and Astronomer. The position is focused on Astronomer/Airflow.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Build and maintain automated data workflows and orchestrations using Apache Airflow</li>\n<li>Implement at least two major end-to-end data pipeline projects using Airflow</li>\n<li>Design and optimize complex DAGs for scalability, maintainability, and reliability</li>\n<li>Create reusable, parameterized, and modular Airflow components (operators, sensors, hooks) to streamline workflow development</li>\n<li>Ensure effective monitoring, alerting, and logging of Airflow DAGs for quick issue resolution</li>\n<li>Document workflows, solutions, and processes for team knowledge sharing and training</li>\n<li>Mentor and support other team members in Airflow usage and adoption</li>\n<li>Explain best practices, identify pros and cons, and communicate technical decisions to team members</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>4-6 years of experience in a similar role</li>\n<li>Bachelor&#39;s degree in IT, Information systems, Computer Science or a related field</li>\n<li>Insurance Experience (Desirable)</li>\n<li>Fluency in English</li>\n<li>Availability to work according to CST or PST time zones.</li>\n</ul>\n<p><strong>Technical Skills</strong></p>\n<ul>\n<li>Airflow (MUST) / Astronomer (PLUS) - Advanced (5 Years)</li>\n<li>Python - Advanced (4-6 Years) (MUST)</li>\n<li>Snowflake – Intermediate (MUST)</li>\n<li>DBT - Entry Level (PLUS)</li>\n<li>AWS Glue - Entry Level (PLUS)</li>\n<li>DynamoDB - Intermediate</li>\n<li>Amazon RDS - Intermediate</li>\n<li>Jenkins - Intermediate</li>\n</ul>\n<p><strong>Other Critical Skills</strong></p>\n<ul>\n<li>Work Independently</li>\n<li>Strategic Thinking</li>\n<li>Guide Others</li>\n<li>Documentation</li>\n<li>Explain best practices</li>\n<li>Communicate Technical Decisions</li>
</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2e1270db-bb7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/g6Kk9PeaSt9vgqEz7dTtY5/remote-fbs-senior-data-engineer-(airflow)-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Airflow","Python","Snowflake","DBT","AWS Glue","DynamoDB","Amazon RDS","Jenkins"],"x-skills-preferred":["Insurance Experience","Fluency in English"],"datePosted":"2026-03-09T16:54:51.948Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Airflow, Python, Snowflake, DBT, AWS Glue, DynamoDB, Amazon RDS, Jenkins, Insurance Experience, Fluency in English"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b2fcfe0b-0dd"},"title":"FBS AWS Data Engineer","description":"<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. This position works on data projects of intermediate complexity, leading the design, development, and implementation of data products.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Prep and cleanse data to optimize for downstream reporting via Farmers standard visualization or AI/ML tools with coaching and feedback</li>\n<li>Translate business data stories into a technical story breakdown structure and work estimates for a schedule or planned agile sprint</li>\n<li>Develop and maintain moderately complex scalable data pipelines for both streaming and batch requirements and build out new API integrations to support increased demands of data volume and complexity</li>\n<li>Produce data building blocks, data models, and data flows for varying client requests such as dimensional data, standard and ad hoc reporting, data feeds, dashboard reporting, and data science research and exploration</li>\n<li>Create business user access methods to structured and unstructured data. Utilize techniques such as mapping data to a common data model, natural language processing, transforming data as necessary to satisfy business rules, AI, statistical computations, and validation</li>\n<li>Acquire, curate, and publish data both on prem and in the cloud for analytical or operational uses for basic to moderate scenarios</li>\n<li>Ensure the data is in a ready-to-use form that creates a single version of the truth across all data consumers, including business/technology users, reporting and visualization specialists, and data scientists, with coaching and support</li>\n<li>Utilize skills to translate business analytic requests/requirements into design, development, testing, deployment, and production maintenance tasks</li>\n<li>Work with various technologies - big data, relational and non-relational databases, cloud environments, different programming languages, and various reporting tools - being familiar with a few while requiring training for some</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>4-6 years of experience in a similar role as a Data Engineer with AWS tools</li>\n<li>BS in Computer Science or similar</li>\n<li>Full English Fluency</li>\n<li>Experience in Insurance within the finance area (PLUS)</li>\n</ul>\n<p><strong>Technical Experience</strong></p>\n<ul>\n<li>Python and SQL – Intermediate (MUST)</li>\n<li>AWS tools such as AWS Glue, S3, AWS Lambda, Iceberg, and Lake Formation (MUST)</li>\n<li>Snowflake - Intermediate (4-6 Years) (MUST)</li>\n<li>DBT - Entry Level (1-3 Years) (MUST)</li>\n<li>AWS Cloud Data - Intermediate (4-6 Years) (MUST)</li>\n<li>MSSQL - Entry Level (1-3 Years) (Desirable)</li>\n<li>Communications - Intermediate</li>\n<li>Office Suite - Intermediate</li>\n<li>Rally - Entry Level or similar</li>\n<li>Agile - Entry Level knowledge</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>This position comes with a competitive compensation and benefits package.</p>\n<ul>\n<li>A competitive salary and performance-based bonuses.</li>\n<li>Comprehensive benefits package.</li>\n<li>Flexible work arrangements (remote and/or office-based).</li>\n<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>\n<li>Private Health Insurance.</li>\n<li>Paid Time Off.</li>\n<li>Training &amp; Development opportunities in partnership with renowned companies.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b2fcfe0b-0dd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Capgemini","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/nog4LBbHddk4ZFvf6Bfqdh/remote-fbs-aws-data-engineer-in-brazil-at-capgemini","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","SQL","AWS Glue","S3","AWS Lambda","Iceberg","Lake Formation","Snowflake","DBT","AWS Cloud Data","MSSQL","Communications","Office Suite","Rally","Agile"],"x-skills-preferred":[],"datePosted":"2026-03-09T16:50:42.993Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, SQL, AWS Glue, S3, AWS Lambda, Iceberg, Lake Formation, Snowflake, DBT, AWS Cloud Data, MSSQL, Communications, Office Suite, Rally, Agile"}]}