<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>5c70414d-4e6</externalid>
      <Title>Full-Stack Data Engineer</Title>
      <Description><![CDATA[<p>We are seeking a highly self-sufficient, motivated engineer with strong full-stack data engineering skills to join our team. This is a remote/offshore role that requires autonomy, excellent communication, and the ability to deliver high-quality work with limited supervision while collaborating with a predominantly US-based team.</p>
<p>You will build reliable, scalable data products and user experiences that power AI/ML modeling, agentic workflows, and reporting, working end-to-end from data ingestion and transformation through to UI. Our Python-based data platform is undergoing a major evolution toward a modern, cloud-native ELT architecture. We are standardizing on Snowflake as our central data platform and dbt as our core transformation framework, implementing scalable, maintainable ELT practices that simplify ingestion, modeling, and deployment.</p>
<p>This role will be pivotal in independently designing and building robust data pipelines and semantic layers that directly power our AI and machine learning initiatives, delivering clean, reliable, and well-modeled data assets to our data science team for feature engineering, model training, and production inference. You will collaborate closely (primarily via remote channels) with data scientists and ML engineers to ensure our data ecosystem is optimized for experimentation speed, model performance, and seamless integration into downstream products and services.</p>
<p>Key Responsibilities</p>
<ul>
<li>Remote collaboration &amp; communication: Operate effectively as an offshore member of a distributed team, proactively communicating status, risks, and blockers across time zones and coordinating overlap with US working hours as needed.</li>
<li>Full-stack data engineering: Build across the entire stack, including data ingestion/acquisition and transformation, APIs, front-end components, and automated test suites, delivering production-grade solutions with minimal hand-holding.</li>
<li>Autonomous delivery &amp; ownership: Take end-to-end ownership of features and projects, clarifying requirements, breaking work into milestones, estimating timelines, and delivering high-quality, well-documented solutions.</li>
<li>Specification and design: Translate short- and long-term business requirements, architectural considerations, and competing timelines into clear, actionable technical specifications and design documents.</li>
<li>Code quality: Write clean, maintainable, efficient code that adheres to evolving standards and quality processes, including unit tests and isolated integration tests in containerized environments.</li>
<li>Continuous improvement: Contribute to agile practices and provide input on technical strategy, architectural decisions, and process improvements, continuously suggesting better tools, patterns, and automation.</li>
</ul>
<p>Required Skills &amp; Experience</p>
<ul>
<li>Professional experience: 5+ years in software engineering, with a full-stack background building complex, scalable data-engineering pipelines using data warehouse technology, SQL with dbt, Python, AWS with Terraform, and modern UI technologies.</li>
<li>Modern data engineering: Strong experience with medallion data architecture patterns using data warehouse technologies (e.g., Snowflake), data transformation tooling (e.g., dbt), BI tooling, and NoSQL data marts (e.g., Elasticsearch/OpenSearch).</li>
<li>Testing and QA: Solid understanding of unit testing, CI/CD automation, and quality assurance processes for both data pipeline testing and operational data quality tests.</li>
<li>Remote work &amp; autonomy: Proven track record working in a remote or distributed environment, demonstrating self-motivation, reliable execution, and the ability to make sound technical decisions independently.</li>
<li>Agile methodology: Working knowledge of Agile development practices and workflows (e.g., sprint planning, stand-ups, retrospectives) in a distributed team setting.</li>
<li>Education: Bachelor’s or Master’s degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.</li>
</ul>
<p>Preferred Skills &amp; Experience</p>
<ul>
<li>Machine learning and AI: Hands-on experience with large language models (LLMs) and agentic frameworks/workflows.</li>
<li>Search and analytics: Familiarity with the ELK stack (Elasticsearch, Logstash, Kibana) for search and analytics solutions.</li>
<li>Cloud expertise: Experience with AWS cloud services; familiarity with SageMaker; and CI/CD tooling such as GitHub Actions or Jenkins.</li>
<li>Front-end expertise: Experience building user interfaces with Angular or a modern UI stack.</li>
<li>Financial domain knowledge: Broad understanding of equities, fixed income, derivatives, futures, FX, and other financial instruments.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, Snowflake, dbt, AWS, Terraform, modern UI technologies, data warehouse technology, SQL, unit testing, CI/CD automation, quality assurance processes, machine learning, AI, large language models, agentic frameworks, ELK stack, search and analytics solutions, cloud expertise, AWS cloud services, SageMaker, CI/CD tooling, front-end expertise, Angular, financial domain knowledge</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>FIC &amp; Risk Technology</Employername>
      <Employerlogo>https://logos.yubhub.co/mlp.eightfold.ai.png</Employerlogo>
      <Employerdescription>FIC &amp; Risk Technology is a technology company that provides risk management solutions.</Employerdescription>
      <Employerwebsite>https://mlp.eightfold.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://mlp.eightfold.ai/careers/job/755955321460</Applyto>
      <Location>Bangalore, Karnataka, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b2f3cc65-797</externalid>
      <Title>Software Engineer II (Sr AI QA Engineer – Python)</Title>
      <Description><![CDATA[<p>We&#39;re looking for an experienced Senior AI QA Engineer to strengthen quality engineering for our AI/ML solutions and web platforms. You will apply your Python expertise to advance the test automation, performance, and security frameworks that drive product excellence.</p>
<p><strong>Role Details</strong></p>
<p><strong>What You Will Do</strong></p>
<ul>
<li>Lead the development of comprehensive evaluation frameworks for AI/ML systems</li>
<li>Develop core AI/ML modules, APIs, and backend services in Python required for framework development</li>
<li>Conduct advanced performance and security testing, resolving systemic issues across AI pipelines and application stacks</li>
<li>Build automation, performance and security test frameworks for web applications and services</li>
<li>Collaborate with product owners and development teams</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Bachelor&#39;s degree in Computer Science, Engineering, or related field</li>
<li>5+ years of overall experience</li>
<li>4+ years of experience developing AI/ML modules, APIs, and backend services using Python</li>
<li>3+ years of experience testing web applications and web services</li>
<li>2+ years in building test automation frameworks using Python or similar languages</li>
<li>Experience with AWS cloud services (Lambda, S3, EC2, EKS) and containerization tools like Docker</li>
<li>Experience with software quality assurance processes and methodologies</li>
</ul>
<p><strong>Benefits</strong></p>
<p>We adopt a holistic approach to our benefits programs, emphasizing physical, emotional, financial, career, and community wellness to support a balanced life. Our packages are tailored to meet local needs and may include healthcare coverage, mental well-being support, retirement savings, paid time off, family leaves, complimentary games, and more.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, AI/ML, AWS cloud services, containerization tools like Docker, software quality assurance processes and methodologies, performance and security testing, web applications and services</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts is a multinational video game developer and publisher with a portfolio of games and experiences. It has locations around the world and opportunities across various departments.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Software-Engineer-II/212862</Applyto>
      <Location>Hyderabad</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>056148f9-afd</externalid>
      <Title>AI Analyst Intern</Title>
      <Description><![CDATA[<p>We are seeking a dynamic AI Analyst to help drive AI-powered quality initiatives, establish robust data governance frameworks, and develop innovative processes that improve efficiency and overall data quality.</p>
<p>Your Contribution:</p>
<ul>
<li>Work with subject matter experts to drive the adoption of AI technology in business processes</li>
<li>Help establish and maintain data governance programs across enterprise applications</li>
<li>Lay the foundation for data-driven decisions using AI technologies</li>
<li>Work with a team of highly talented individuals to understand and support the data needs of our business</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Experience building predictive models, especially classification</li>
<li>Excellent understanding of machine learning techniques and AI</li>
<li>Expertise in SQL and Python, experience with NoSQL is a plus</li>
<li>A self-driven ownership mindset with a natural curiosity and excellence in finding solutions to ambiguous problems</li>
<li>Strong analytical skills for working with unstructured datasets</li>
<li>Experience with EDWs or data lakes a plus</li>
<li>Experience with AWS cloud services</li>
<li>Junior/Senior pursuing a degree in Data Science/Analytics, Computer Science (focus on AI or Machine Learning), Information Systems/AI or related fields</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Flexible work arrangements</li>
<li>Opportunities for professional growth and development</li>
<li>Collaborative and dynamic work environment</li>
<li>Recognition and rewards for outstanding performance</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Predictive models, Machine learning techniques, SQL, Python, NoSQL, Data governance, Data lakes, AWS cloud services, EDWs, Data analytics, Computer science</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Logitech</Employername>
      <Employerlogo>https://logos.yubhub.co/logitech.com.png</Employerlogo>
      <Employerdescription>Logitech is a multinational company that designs and manufactures computer peripherals, software, and mobile communication products.</Employerdescription>
      <Employerwebsite>https://logitech.wd5.myworkdayjobs.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://logitech.wd5.myworkdayjobs.com/en-US/Logitech/job/Camas-Washington---USA/AI-Analyst-Intern_145578</Applyto>
      <Location>Camas, Washington</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
  </jobs>
</source>