<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>0594b3f5-9a0</externalid>
      <Title>Software Engineer</Title>
      <Description><![CDATA[<p>Join the Voice &amp; Video Postflight team as Twilio&#39;s next Senior Software Engineer.</p>
<p>This position is needed to build and evolve next-generation distributed systems that empower our customers through high-performance APIs. You will be tasked with solving the complex challenges inherent in supporting the massive scale of Twilio Voice, ensuring our infrastructure remains robust as we expand our capabilities.</p>
<p>As a Software Engineer, you will focus on the intersection of large-scale API development and advanced data systems. You will work on designing and implementing low-latency, highly scalable architectures that leverage modern database technologies to provide customers with seamless access to large-scale data.</p>
<p>Responsibilities:</p>
<p>Architect and implement next-generation distributed systems capable of handling the immense throughput and concurrency requirements of Twilio Voice.</p>
<p>Design low-latency, high-scale APIs that empower customers with real-time access to their data and communications infrastructure.</p>
<p>Optimize and manage distributed database environments, ensuring high availability and performance across high-volume data stores.</p>
<p>Own the full development lifecycle, from initial system design and prototyping to the continuous operation of 24x7 production services.</p>
<p>Collaborate across engineering teams to solve &#39;hard&#39; distributed systems problems, ensuring our API layer is both resilient and developer-friendly.</p>
<p>Qualifications:</p>
<p>A Master&#39;s or Bachelor&#39;s degree and 5+ years of experience in software engineering, with a focus on backend or infrastructure systems.</p>
<p>Expertise in Distributed Systems: A deep understanding of consistency models, partition tolerance, and the challenges of scaling stateful services.</p>
<p>Core Languages &amp; Frameworks: Proficiency in Java and frameworks such as Spring and Dropwizard, and a strong grasp of building RESTful APIs at scale.</p>
<p>Database Fundamentals: Practical experience working with and tuning PostgreSQL, Aurora or similar relational databases.</p>
<p>Cloud Infrastructure: Familiarity with deploying and managing large-scale services on AWS or GCP.</p>
<p>Operational Excellence: Comfortable operating in an agile environment with a &#39;you build it, you run it&#39; mentality.</p>
<p>Desired:</p>
<p>OLAP &amp; Big Data: Experience with ClickHouse or other column-oriented databases for high-performance analytical queries.</p>
<p>Infrastructure as Code: Familiarity with tools such as Terraform and Harness for managing systems.</p>
<p>Data Pipelines: Prior exposure to technologies like Kafka or Spark for moving and processing data between distributed systems.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Distributed Systems, Java, Spring, Dropwizard, PostgreSQL, Aurora, AWS, GCP, Operational Excellence, OLAP &amp; Big Data, Infrastructure as Code, Data Pipelines</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Twilio</Employername>
      <Employerlogo>https://logos.yubhub.co/twilio.com.png</Employerlogo>
      <Employerdescription>Twilio delivers innovative solutions to hundreds of thousands of businesses and empowers millions of developers worldwide to craft personalized customer experiences.</Employerdescription>
      <Employerwebsite>https://www.twilio.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/twilio/jobs/7785202</Applyto>
      <Location>Remote - Ireland</Location>
      <Country>Ireland</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>5093878d-5f9</externalid>
      <Title>C# Application Engineer, Associate</Title>
      <Description><![CDATA[<p>At BlackRock, technology is central to our mission, and our team continues to drive innovation across the industry. We value curiosity, collaboration, and a willingness to experiment to tackle complex problems. As an Associate, Software Engineer on the Enterprise Data Solutions data team, you&#39;ll join an engineering pod responsible for third-party integrations, data feeds, APIs, and incorporating Preqin data into the BlackRock Enterprise Data Platform. Your work will focus on building robust, scalable systems that align with business objectives. You&#39;ll deliver high-quality solutions by applying strong data expertise, product insight, and effective communication skills. Collaboration with other engineers, product managers, and data owners will be key as you help design, develop, and launch new features and influence our technical strategy.</p>
<p>Key responsibilities will include:</p>
<ul>
<li>Designing, implementing, and maintaining robust systems for data distribution to customers and third-party integrations.</li>
<li>Collaborating closely with engineering teams across the organisation to ensure adoption of optimal technical solutions and raising development standards through knowledge sharing and best practice implementation.</li>
<li>Actively contributing to technical discussions regarding new product directions, data modelling, and architectural decisions to ensure the technology platform remains scalable and adaptable.</li>
</ul>
<p>What we are looking for:</p>
<ul>
<li>4+ years’ experience in software engineering.</li>
<li>Strong technical ability across the full stack: C#; Python, FastAPI, React, and TypeScript are a plus.</li>
<li>Experience with PostgreSQL, MongoDB and other SQL and NoSQL databases (AWS Aurora, Azure Cosmos DB, MS SQL Server, Cassandra are a plus).</li>
<li>Experience with Data Warehouse systems, particularly Snowflake (Databricks is a plus).</li>
<li>Experience working with cloud provider services (Azure or AWS) and using infrastructure as code (Terraform).</li>
<li>Familiarity with containerisation: Docker and Kubernetes.</li>
<li>Excel plugin development experience is a plus.</li>
<li>Excellent verbal and written communication and interpersonal skills.</li>
<li>A “let’s do it” and “challenge accepted” attitude when faced with challenging tasks, and willingness to learn new technologies and ways of working.</li>
</ul>
<p>Our benefits include retirement investment, education reimbursement, comprehensive resources to support physical health and emotional well-being, family support programs, and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</p>
<p>Our hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>C#, Python, FastAPI, React, TypeScript, PostgreSQL, MongoDB, AWS Aurora, Azure Cosmos DB, MS SQL Server, Cassandra, Snowflake, Databricks, Terraform, Docker, Kubernetes, Excel plugin development</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>BlackRock</Employername>
      <Employerlogo>https://logos.yubhub.co/blackrock.com.png</Employerlogo>
      <Employerdescription>BlackRock is a global investment management company that provides a range of investment products and services to institutional and retail clients.</Employerdescription>
      <Employerwebsite>https://www.blackrock.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/eqw6vaVybvYbGaFNw8hUeY/c%23%2C-application-engineer%2C-associate-in-london-at-blackrock</Applyto>
      <Location>London</Location>
      <Country>United Kingdom</Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>1c4d3e81-9ab</externalid>
      <Title>Senior Cloud Database Engineer</Title>
      <Description><![CDATA[<p>Join Razer on a global mission to revolutionise the way the world games. As a Senior Cloud Database Engineer, you will be responsible for managing and maintaining databases hosted on cloud platforms such as AWS, MongoDB Atlas, and others.</p>
<p><strong>Job Responsibilities</strong></p>
<ul>
<li>Manage the cloud infrastructure for data, ensuring scalability, availability, and security.</li>
<li>Play a critical role in disaster recovery planning and implementation.</li>
<li>Help the company reduce costs by optimising database performance and usage.</li>
</ul>
<p><strong>Database Management</strong></p>
<p>Oversee the administration, maintenance, and optimisation of AWS cloud databases, including MySQL, Aurora MySQL, MongoDB, MSSQL, and DynamoDB.</p>
<p><strong>Performance Tuning</strong></p>
<p>Monitor and optimise database performance, including managing indexing, queries, and caching strategies.</p>
<p><strong>Backup and Recovery</strong></p>
<p>Implement and manage backup and recovery strategies to safeguard data integrity.</p>
<p><strong>Security</strong></p>
<p>Ensure database security by implementing best practices and monitoring for potential vulnerabilities.</p>
<p><strong>Automation</strong></p>
<p>Develop and maintain automation scripts for database management tasks using tools like AWS Lambda, CloudFormation, and Terraform.</p>
<p><strong>Monitoring</strong></p>
<p>Set up and manage monitoring tools to track database performance and health.</p>
<p><strong>Troubleshooting</strong></p>
<p>Diagnose and resolve database-related issues, providing timely support to development and operations teams.</p>
<p><strong>Collaboration</strong></p>
<p>Work closely with development teams to design and implement database solutions that meet business requirements.</p>
<p><strong>Automating Tasks</strong></p>
<p>Use scripts and cloud tools to automate routine database tasks such as backups, updates, and monitoring.</p>
<p><strong>Migration</strong></p>
<p>Plan and execute data migrations from on-premises databases to cloud platforms or between different cloud services.</p>
<p><strong>Documentation</strong></p>
<p>Maintain comprehensive documentation of database configurations, processes, and procedures.</p>
<p><strong>Pre-Requisites</strong></p>
<ul>
<li>Experience: Minimum of 5 years of experience in database administration and management, with a focus on cloud-hosted databases on AWS and MongoDB Atlas.</li>
<li>Technical Skills: Proficiency in MySQL, Aurora MySQL, MongoDB, PostgreSQL, MSSQL, and DynamoDB. Experience with AWS services such as RDS, EC2, S3, and IAM.</li>
<li>Scripting: Strong scripting skills in languages such as Python, Bash, or PowerShell.</li>
<li>Problem-Solving: Excellent analytical and problem-solving skills.</li>
<li>Communication: Strong communication and collaboration skills, with the ability to work effectively in a team environment.</li>
<li>Certifications: AWS Certified Database - Specialty or similar certifications are a plus.</li>
<li>Must be a Vietnamese citizen based in Ho Chi Minh City.</li>
</ul>
<p><strong>Preferred Qualifications</strong></p>
<ul>
<li>Experience with database migration and upgrade projects.</li>
<li>Knowledge of containerisation technologies like Docker and Kubernetes.</li>
<li>Familiarity with DevOps practices and CI/CD pipelines.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement></Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>MySQL, Aurora MySQL, MongoDB, PostgreSQL, MSSQL, DynamoDB, AWS services, Python, Bash, PowerShell, database migration, containerisation technologies, DevOps practices, CI/CD pipelines</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Razer</Employername>
      <Employerlogo>https://logos.yubhub.co/razer.com.png</Employerlogo>
      <Employerdescription>Razer is a global organisation that designs and manufactures gaming peripherals and laptops. It has a global team located across 5 continents.</Employerdescription>
      <Employerwebsite>https://razer.wd3.myworkdayjobs.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://razer.wd3.myworkdayjobs.com/en-US/Careers/job/Ho-Chi-Minh/Senior-Database-Administrator_JR2026006852</Applyto>
      <Location>Ho Chi Minh City</Location>
      <Country>Vietnam</Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>4892788f-14a</externalid>
      <Title>Senior Backend Engineer (Infrastructure)</Title>
      <Description><![CDATA[<p><strong>Compensation</strong></p>
<p>$230K – $280K • Offers Equity</p>
<p><strong>Why this is a massive opportunity:</strong></p>
<p>We&#39;re using AI to codify the world&#39;s international trade law. Every business around the world needs this; compliance isn&#39;t optional.</p>
<ul>
<li>Strong macro tailwinds: Businesses are expanding globally at a faster pace, while tax authorities are cracking down on cross-border compliance, driving urgent demand.</li>
<li>AI advantage: Incumbents can&#39;t scale beyond the U.S.; our AI-native tax engine can.</li>
<li>Multi-product play: Indirect tax is just the start; we&#39;re building a compound startup that helps businesses with all forms of revenue-based compliance.</li>
</ul>
<p><strong>What you will do:</strong></p>
<p>Within weeks:</p>
<ul>
<li>Find solutions to Sphere&#39;s toughest scaling, performance, and latency problems</li>
<li>Work closely with the engineering team to define tooling that helps us ship even faster</li>
<li>Participate in an on-call rotation to resolve critical production events</li>
<li>Work directly with customers like Eleven Labs, Replit, and Windsurf, and partners like Stripe and Chargebee, on their latency and availability requirements.</li>
</ul>
<p>Within months:</p>
<ul>
<li>Influence and implement the next generation of Sphere&#39;s database, real-time queue, and container orchestration infrastructure</li>
<li>Work across our engineering organization to introduce and scale best practices with cloud-native technologies like Amazon ALB, ECS/EKS, Temporal, AWS SQS, Amazon Aurora PostgreSQL, ElastiCache Redis, and S3</li>
<li>Build abstractions within Terraform to simplify the architecture and increase velocity and ownership</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Experience managing Kubernetes (k8s) clusters in AWS/GCP/Azure at scale</li>
<li>Extensive experience shipping high-quality architectures for mission-critical systems (focus on high availability, high load, low latency)</li>
<li>Experience with Postgres at scale</li>
</ul>
<p><strong>Nice to have:</strong></p>
<ul>
<li>Experience working with large volumes of transaction data. You’ll be getting very familiar with it!</li>
<li>Strong experience in Python. Our core application backend and data pipeline services are built with Python and Django.</li>
<li>Passionate about developer experience.</li>
<li>Very strong attention to detail. When you work with numbers this is non-negotiable: it’s not enough to be 99% right.</li>
</ul>
<p><strong>What’s important to us:</strong></p>
<ul>
<li>The Dog: Grit &gt; pedigree.</li>
<li>Ship fast: If it can be done today, do it today.</li>
<li>Self-starter: No hand-holding.</li>
<li>Accountability: You’ll be held accountable to objective, measurable targets each month.</li>
<li>In-person: SF only.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>on-site</Workarrangement>
      <Salaryrange>$230K – $280K</Salaryrange>
      <Skills>Kubernetes, AWS, GCP, Azure, Postgres, Python, Django, Temporal, AWS SQS, Amazon Aurora PostgreSQL, ElastiCache Redis, S3, Terraform</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Sphere</Employername>
      <Employerlogo>https://logos.yubhub.co/sphere.com.png</Employerlogo>
      <Employerdescription>Sphere is building AI-native trade infrastructure to help companies automate their sales tax, VAT, and GST compliance obligations. They have spent the last 2 years building local rails into tax authorities in both the US and internationally.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>230000</Compensationmin>
      <Compensationmax>280000</Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/sphere/c7b17c18-2dde-4b83-a8a5-ff5effe94dd2</Applyto>
      <Location>San Francisco HQ</Location>
      <Country>United States</Country>
      <Postedate>2026-03-08</Postedate>
    </job>
  </jobs>
</source>