<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>d6e7c226-e8c</externalid>
      <Title>Technical Lead, MFT MDE Analytics Engineering</Title>
      <Description><![CDATA[<p>The SPEED Market Data team at Equity IT is seeking a hands-on Technical Lead to own and drive a critical workstream focused on architecting, implementing, monitoring, and supporting low-latency C++ systems. As a Technical Lead, you will shape the future of the industry by working alongside exceptional engineers and strategists to solve significant engineering problems.</p>
<p>We are looking for a strong technical leader with financial markets technology experience and real-time market data expertise to design, build, and support our global real-time market data platform. This role emphasizes technical leadership, architectural ownership, and cross-team coordination rather than people management.</p>
<p>Principal Responsibilities:</p>
<ul>
<li>Act as the technical owner for a major market data workstream, setting technical direction, defining architecture, and driving execution across the full lifecycle.</li>
<li>Collaborate with hardware and software teams across divisions to design and build real-time market data processing and distribution systems.</li>
<li>Lead and drive new technical initiatives for the team, including evaluating technologies, defining standards, and establishing best practices.</li>
<li>Design and develop systems, interfaces, and tools for historical market data and trading simulations that increase research productivity.</li>
<li>Architect and implement components of an enterprise market data platform, including components for caching, aggregation, conflation and value-added data enrichment.</li>
<li>Optimise platform performance using network and systems programming, and advanced low-latency techniques (CPU, NIC, kernel, and application-level tuning).</li>
<li>Lead the design and maintenance of automated test and benchmark frameworks, and tools for risk management, performance tracking, and system validation.</li>
<li>Provide technical leadership for the support and operation of the enterprise real-time market data environments, including coordinating internal, vendor, and exchange-driven changes.</li>
<li>Design and engineer components to automate support and management of the market data platform, including monitoring, real-time and historical metrics collection/visualisation, and self-service administrative/user tools.</li>
<li>Serve as a primary technical liaison for users of the market data environment (Portfolio Managers, trading desks, and core technology teams), translating requirements into robust technical solutions.</li>
<li>Lead the enhancement of processes and workflows for operating the market data platform (release/deployment, incident management and remediation, exchange notification handling, defining and enforcing SLAs).</li>
<li>Mentor and influence other engineers through code reviews, design reviews, and hands-on guidance, fostering a culture of technical excellence and accountability.</li>
</ul>
<p>Qualifications / Skills Required:</p>
<ul>
<li>Degree in Computer Science or a related field with a strong background in data structures, algorithms, and object-oriented programming in modern C++.</li>
<li>Deep understanding of Linux system internals and networking, especially in low-latency and high-throughput environments.</li>
<li>Strong knowledge of CPU architecture and the ability to leverage CPU capabilities for performance optimisation.</li>
<li>Demonstrated experience acting as a technical lead or senior engineer owning complex systems or workstreams end-to-end (design, delivery, and operations).</li>
<li>Able to prioritise and make trade-offs in a fast-moving, high-pressure environment; strong sense of urgency, ownership, and follow-through.</li>
<li>Strong belief in and practice of extreme ownership, with a track record of taking accountability for systems in production.</li>
<li>Effective communication and stakeholder management skills: able to work closely with business and technology users, understand their needs, and drive appropriate technical solutions.</li>
<li>Experience building solutions on cloud environments such as GCP and AWS.</li>
<li>Knowledge of additional programming languages such as Java, Python, or scripting (Perl, shell).</li>
<li>Technical background in application development on complex market data systems (e.g., Bloomberg, Thomson Reuters).</li>
<li>Experience supporting market data environments within a global organisation, including internally developed DMA feed handlers and distribution infrastructure.</li>
<li>Strong understanding of market data concepts and functionality, including data models (fields/messages), protocols (e.g., snapshot + delta), order book representations (L1/L2/L3), recovery, and reliability.</li>
<li>Hands-on Site Reliability Engineering or DevOps experience, including system administration, automation, measurement, and release/deployment management.</li>
<li>Experience with monitoring, metrics, and command/control tooling for distributed market data platforms, with the ability to evaluate existing solutions and drive enhancements across development and operations.</li>
<li>Ability to operate with a high level of thoroughness and attention to detail, demonstrating strong ownership of deliverables and production systems.</li>
</ul>
<p>Millennium offers a total compensation package that includes a base salary, a discretionary performance bonus, and a comprehensive benefits package. The estimated base salary range for this position is $175,000 to $250,000; this range is specific to New York and may change in the future. When finalising an offer, we take into consideration an individual&#39;s experience level and the qualifications they bring to the role to formulate a competitive total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$175,000 to $250,000</Salaryrange>
      <Skills>C++, Linux system internals, Networking, CPU architecture, Object-oriented programming, Cloud environments, Java, Python, Scripting, Market data systems, Site Reliability Engineering, DevOps, Monitoring, Metrics, Command/control tooling</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Equity IT</Employername>
      <Employerlogo>https://logos.yubhub.co/mlp.eightfold.ai.png</Employerlogo>
      <Employerdescription>Equity IT is a technology company that provides services to the financial industry.</Employerdescription>
      <Employerwebsite>https://mlp.eightfold.ai</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>175000</Compensationmin>
      <Compensationmax>250000</Compensationmax>
      <Applyto>https://mlp.eightfold.ai/careers/job/755954905529</Applyto>
      <Location>New York, New York, United States of America</Location>
      <Country>United States of America</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>be821069-a7f</externalid>
      <Title>Asset Data Engineer</Title>
      <Description><![CDATA[<p>Join the Asset Data team and build the streaming data infrastructure that powers Anchorage&#39;s digital asset platform. You&#39;ll design systems that ingest real-time blockchain and market data from diverse providers, transforming raw feeds into certified, trusted data products.</p>
<p>We&#39;re creating contract-governed supply chains that let us onboard new assets and providers quickly while maintaining the low-latency, high-availability SLOs our business depends on.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Build streaming data pipelines for blockchain data (onchain transactions, staking rewards, validator info) and market data (prices, trades, order books)</li>
<li>Design and implement data contracts and validation gates that enforce quality and schema compliance at ingestion points</li>
</ul>
<p><strong>Complexity and Impact of Work:</strong></p>
<ul>
<li>Collaborate on designing the architecture for standardized ingestion patterns that enable rapid onboarding of new blockchains and market data feeds</li>
<li>Establish redundancy and failover patterns to meet Tier 1 availability and freshness SLOs for critical data products</li>
</ul>
<p><strong>Organizational Knowledge:</strong></p>
<ul>
<li>Collaborate with Protocols, Trading, and Custody teams to understand their data needs and design certified data products with clear SLAs</li>
<li>Partner with Data Platform team on orchestration, storage patterns (BigLake), and metadata management (Atlan)</li>
</ul>
<p><strong>Communication and Influence:</strong></p>
<ul>
<li>Advocate for contract-governed data supply chains and help establish engineering standards for producer patterns across the org</li>
<li>Contribute to architectural decisions and help mature the team&#39;s practices around observability, testing, and operational excellence</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>5-7+ years building streaming or high-throughput data systems: You have experience designing and operating production data pipelines that handle large volumes with low latency and high reliability</li>
<li>Solid backend engineering skills: You&#39;re proficient in Go or Python and have built services that interact with streaming infrastructure (Kafka, pub/sub, websockets, REST APIs)</li>
<li>Blockchain data familiarity: You understand blockchain concepts and are comfortable working with on-chain data (transactions, events, staking, validators) across multiple chains with different data models</li>
<li>Data engineering adjacent skills: You&#39;re comfortable with data transformation patterns, schema evolution, and working with cloud data warehouses (BigQuery) and storage systems (GCS, BigLake)</li>
<li>Operational mindset: You have experience deploying and operating services on cloud platforms (preferably GCP), with strong practices around monitoring, alerting, and incident response</li>
</ul>
<p><strong>Preferred Qualifications:</strong></p>
<ul>
<li>Staking data expertise: You&#39;ve worked with staking rewards, validator data, or proof-of-stake blockchain infrastructure</li>
<li>Market data systems: You&#39;ve built systems that ingest and process market data (prices, trades, order books) from exchanges or data vendors</li>
<li>Infrastructure as code: You have experience with Terraform, Kubernetes, and modern DevOps practices</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Go, Python, Kafka, pub/sub, websockets, REST APIs, blockchain data, data transformation patterns, schema evolution, cloud data warehouses, storage systems, staking data, market data systems, infrastructure as code</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anchorage Digital</Employername>
      <Employerlogo>https://logos.yubhub.co/anchorage.co.png</Employerlogo>
      <Employerdescription>Anchorage Digital is a regulated crypto platform that provides institutions with integrated financial services and infrastructure solutions.</Employerdescription>
      <Employerwebsite>https://www.anchorage.co/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/anchorage/82139746-fb0e-44b9-bbb6-ae078e5d251a</Applyto>
      <Location>New York City</Location>
      <Country>United States</Country>
      <Postedate>2026-04-17</Postedate>
    </job>
  </jobs>
</source>