<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>586b9fef-509</externalid>
      <Title>Senior Software Engineer - Network Enablement (Applied ML)</Title>
      <Description><![CDATA[<p>We believe that the way people interact with their finances will drastically improve in the next few years. We&#39;re dedicated to empowering this transformation by building the tools and experiences that thousands of developers use to create their own products.</p>
<p>On this team, you will build and operate the ML infrastructure and product services that enable trust and intelligence across Plaid&#39;s network. You&#39;ll own feature engineering, offline training and batch scoring, online feature serving, and real-time inference so model outputs directly power partner-facing fraud &amp; trust products and bank intelligence features.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Embed model inference into Network Enablement product flows and decision logic (APIs, feature flags, backend flows).</li>
<li>Define and instrument product + ML success metrics (fraud reduction, retention lift, false positives, downstream impact).</li>
<li>Design and run experiments and rollout plans (backtesting, shadow scoring, A/B tests, feature-flagged releases) to validate product hypotheses.</li>
<li>Build and operate offline training pipelines and production batch scoring for bank intelligence products.</li>
<li>Ship and maintain online feature serving and low-latency model inference endpoints for real-time partner/bank scoring.</li>
<li>Implement model CI/CD, model/version registry, and safe rollout/rollback strategies.</li>
<li>Monitor model/data health: drift/regression detection, model-quality dashboards, alerts, and SLOs targeted to partner product needs.</li>
<li>Ensure offline and online parity, data lineage, and automated validation / data contracts to reduce regressions.</li>
<li>Optimize inference performance and cost for real-time scoring (batching, caching, runtime selection).</li>
<li>Ensure fairness, explainability and PII-aware handling for partner-facing ML features; maintain auditability for compliance.</li>
<li>Partner with platform and cross-functional teams to scale the ML/data foundation (graph features, sequence embeddings, unified pipelines).</li>
<li>Mentor engineers and document team standards for ML productization and operations.</li>
</ul>
<p><strong>Qualifications</strong></p>
<p><strong>Must-haves:</strong></p>
<ul>
<li>Strong software engineering skills, including systems design, APIs, and building reliable backend services (Go or Python preferred).</li>
<li>Production experience with batch and streaming data pipelines and orchestration tools such as Airflow or Spark.</li>
<li>Experience building or operating real-time scoring and online feature-serving systems, including feature stores and low-latency model inference.</li>
<li>Experience integrating model outputs into product flows (APIs, feature flags) and measuring impact through experiments and product metrics.</li>
<li>Experience with model lifecycle and operations: model registries, CI/CD for models, reproducible training, offline &amp; online parity, monitoring, and incident response.</li>
</ul>
<p><strong>Nice to have:</strong></p>
<ul>
<li>Experience in fraud, risk, or marketing intelligence domains.</li>
<li>Experience with feature-store products (Tecton / Chronon / Feast / internal) and unified pipelines.</li>
<li>Experience with graph frameworks, graph feature engineering, or sequence embeddings.</li>
<li>Experience optimizing inference at scale (Triton/ONNX/quantization, batching, caching).</li>
</ul>
<p><strong>Additional Information</strong></p>
<p>Our mission at Plaid is to unlock financial freedom for everyone. To support that mission, we seek to build a diverse team of driven individuals who care deeply about making the financial ecosystem more equitable.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$190,800-$286,800 per year</Salaryrange>
      <Skills>software engineering, systems design, APIs, backend services, Go, Python, batch and streaming data pipelines, orchestration tools, Airflow, Spark, real-time scoring, online feature-serving systems, feature stores, low-latency model inference, model outputs, product flows, experiments, product metrics, model lifecycle, operations, model registries, CI/CD, reproducible training, offline &amp; online parity, monitoring, incident response, fraud, risk, marketing intelligence, feature-store products, unified pipelines, graph frameworks, graph feature engineering, sequence embeddings, inference at scale, Triton, ONNX, quantization, batching, caching</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Plaid</Employername>
      <Employerlogo>https://logos.yubhub.co/plaid.com.png</Employerlogo>
      <Employerdescription>Plaid is a technology company that powers the tools millions of people rely on to live a healthier financial life. The company has a presence in multiple countries and works with thousands of companies.</Employerdescription>
      <Employerwebsite>https://plaid.com/</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>190800</Compensationmin>
      <Compensationmax>286800</Compensationmax>
      <Applyto>https://jobs.lever.co/plaid/43b1374d-5c5e-4b63-b710-a95e3cb76bbe</Applyto>
      <Location>San Francisco</Location>
      <Country>United States</Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>8e385f8d-94b</externalid>
      <Title>Technical Writer, Aladdin Studio, Associate</Title>
      <Description><![CDATA[<p><strong>About this role</strong></p>
<p>We&#39;re seeking an experienced Technical Writer to help define and deliver world-class documentation for the Aladdin Studio developer platform. Aladdin Studio is transforming how developers and data engineers interact with the Aladdin ecosystem, enabling teams to build, integrate, and extend the Aladdin platform through open APIs and event-driven workflows.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Develop and maintain clear, accurate, and engaging documentation for Aladdin Studio&#39;s APIs, SDKs, and event streaming interfaces.</li>
<li>Collaborate closely with engineers, product managers, and developer experience teams to translate complex technical concepts into approachable guides, tutorials, and reference materials.</li>
<li>Document event-driven workflows, including streaming APIs, webhook subscriptions, and real-time data integration patterns.</li>
<li>Design and implement content structures that scale across multiple APIs, microservices, and event channels within Aladdin Studio.</li>
<li>Contribute to tooling and automation, using OpenAPI/AsyncAPI specs and CI/CD pipelines to generate and version developer documentation.</li>
<li>Partner with the Studio Product Marketing and Solution Architecture teams to create onboarding materials, sample code, and “getting started” experiences for external and internal developers.</li>
<li>Continuously improve the discoverability and usability of content within the Aladdin Studio Developer Portal.</li>
<li>Champion documentation standards across the Aladdin Product Group, ensuring consistency, clarity, and technical accuracy.</li>
</ul>
<p><strong>Required Qualifications</strong></p>
<ul>
<li>2+ years of experience as a Technical Writer, Developer Advocate, or Software Engineer focused on APIs or event-driven systems.</li>
<li>Deep understanding of RESTful and event-streaming architectures (e.g., Kafka, Kinesis, or similar).</li>
<li>Proven experience writing API and developer documentation using OpenAPI/Swagger or AsyncAPI specifications.</li>
<li>Hands-on familiarity with developer tooling such as Git, Postman, Redocly, or similar platforms.</li>
<li>Strong grasp of cloud-based integration concepts, including authentication, webhooks, and event publishing/subscription models.</li>
<li>Excellent written communication skills and ability to translate complex systems into developer-friendly content.</li>
<li>Proficiency in Markdown, YAML, and basic scripting (Python, JavaScript, or similar).</li>
</ul>
<p><strong>What We Offer</strong></p>
<ul>
<li>Opportunity to shape the developer experience for the Aladdin ecosystem, a platform trusted by the world’s largest financial institutions.</li>
<li>A collaborative, growth-oriented environment within BlackRock’s Aladdin Product Group.</li>
<li>Competitive compensation, benefits, and professional development opportunities.</li>
<li>Direct impact on how external developers and partners extend Aladdin’s capabilities through APIs and streaming data.</li>
</ul>
<p><strong>Our benefits</strong></p>
<p>To help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment and tools designed to help you in building a sound financial future; access to education reimbursement; comprehensive resources to support your physical health and emotional well-being; family support programs; and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</p>
<p><strong>Our hybrid work model</strong></p>
<p>BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>API documentation, event streaming, developer experience, OpenAPI/Swagger, AsyncAPI, Git, Postman, Redocly, Markdown, YAML, Python, JavaScript, data integration, analytics, asset and risk management, streaming data pipelines, real-time analytics, API lifecycle management, continuous documentation delivery</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>BlackRock</Employername>
      <Employerlogo>https://logos.yubhub.co/blackrock.com.png</Employerlogo>
      <Employerdescription>BlackRock is a global investment management company that provides a range of investment products and services to institutional and individual investors.</Employerdescription>
      <Employerwebsite>https://www.blackrock.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/gyzyNbsJN1TLzb2rwyj4yv/technical-writer%2C-aladdin-studio%2C-associate-in-edinburgh-at-blackrock</Applyto>
      <Location>Edinburgh, Scotland</Location>
      <Country>United Kingdom</Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>015e5c6d-a31</externalid>
      <Title>Senior Data Engineer</Title>
      <Description><![CDATA[<p><strong>Why Valvoline Global Operations?</strong></p>
<p>At Valvoline Global Operations, we&#39;re proud to be The Original Motor Oil, but we&#39;ve never rested on being first. Founded in 1866, we introduced the world&#39;s first branded motor oil, staking our claim as a pioneer in the automotive and industrial solutions industry.</p>
<p><strong>Job Purpose</strong></p>
<p>We are seeking a highly skilled and motivated Data Engineer to join our growing data and analytics team. The ideal candidate will have strong experience designing and developing scalable data pipelines, integrating complex systems, and optimizing data workflows. Proficiency in Databricks and SAP Datasphere is preferred, as these platforms are central to our data ecosystem.</p>
<p><strong>How You Make an Impact (Job Accountabilities)</strong></p>
<ul>
<li>Design, build, and maintain robust, scalable, and high-performance data pipelines using Databricks and SAP Datasphere.</li>
<li>Collaborate with data architects, analysts, data scientists, and business stakeholders to gather requirements and deliver data solutions aligned with stakeholders&#39; goals.</li>
<li>Integrate diverse data sources (e.g., SAP, APIs, flat files, cloud storage) into the enterprise data platforms.</li>
<li>Ensure high standards of data quality and implement data governance practices.</li>
<li>Stay current with emerging trends and technologies in cloud computing, big data, and data engineering.</li>
<li>Provide ongoing support for the platform, troubleshoot any issues that arise, and ensure high availability and reliability of data infrastructure.</li>
<li>Create documentation for the platform infrastructure and processes, and train other team members and users to use the platform effectively.</li>
</ul>
<p><strong>What You Bring to the Role (Job Qualifications / Education / Skills / Requirements / Capabilities)</strong></p>
<ul>
<li>Bachelor&#39;s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.</li>
<li>5-7+ years of experience in a data engineering or related role.</li>
<li>Strong knowledge of data engineering principles, data warehousing concepts, and modern data architecture.</li>
<li>Proficiency in SQL and at least one programming language (e.g., Python, Scala).</li>
<li>Experience with cloud platforms (e.g., Azure, AWS, or GCP), particularly in data services.</li>
<li>Familiarity with data processing and orchestration tools (e.g., PySpark, Airflow, Azure Data Factory) and CI/CD pipelines.</li>
</ul>
<p><strong>Competencies Desired</strong></p>
<ul>
<li>Hands-on experience with Databricks (including Spark/PySpark, Delta Lake, MLflow, Unity Catalog, etc.).</li>
<li>Practical experience working with SAP Datasphere (or SAP Data Warehouse Cloud) in data modeling and data integration scenarios.</li>
<li>SAP BW or SAP HANA experience is a plus.</li>
<li>Experience with BI tools like Power BI or Tableau.</li>
<li>Understanding of data governance frameworks and data security best practices.</li>
<li>Exposure to data lakehouse architecture and real-time streaming data pipelines.</li>
<li>Certifications in Databricks, SAP, or cloud platforms are advantageous.</li>
</ul>
<p><strong>Working Conditions / Physical Requirements / Travel Requirements</strong></p>
<ul>
<li>Normal office environment.</li>
<li>Prolonged periods of computer use and frequent participation in meetings.</li>
<li>Occasional walking, standing, and light lifting (up to 10 lbs).</li>
<li>Minimal travel required.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, Databricks, SAP Datasphere, SQL, Python, Scala, cloud platforms, data orchestration tools, CI/CD pipelines, SAP BW, SAP HANA, Power BI, Tableau, data governance frameworks, data security best practices, data lakehouse architecture, real-time streaming data pipelines</Skills>
      <Category>Engineering</Category>
      <Industry>Automotive</Industry>
      <Employername>Valvoline Global Operations</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.valvolineglobal.com.png</Employerlogo>
      <Employerdescription>Valvoline Global Operations is a global company that develops future-ready products and provides best-in-class services for the automotive and industrial solutions industry.</Employerdescription>
      <Employerwebsite>https://jobs.valvolineglobal.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.valvolineglobal.com/job/Senior-Data-Engineer/1316654400/</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
  </jobs>
</source>