<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>038b4893-89b</externalid>
      <Title>IT Audit Lead</Title>
      <Description><![CDATA[<p>We are seeking an IT Audit Lead to join our Management Controls and Internal Audit Group. As an IT Audit Lead, you will be responsible for leading IT audit engagements, planning and carrying out the audit, and continuously working to improve processes and procedures. You will work closely with the Head of Information Technology Audit to develop and maintain an in-depth understanding of the technology organization, business areas, and support functions.</p>
<p>Primary Responsibilities:</p>
<ul>
<li>Lead and perform IT and integrated audit engagements, with support from IT Auditors, focusing on IT core infrastructure, trade execution and trade processing infrastructure, critical applications, and IT general controls;</li>
<li>Build and maintain relationships with key stakeholders, establishing a culture of engagement while adding value;</li>
<li>Develop and maintain an in-depth understanding of the technology organization, business areas, and support functions;</li>
<li>Support the Head of Information Technology Audit with audit planning, scope design, internal control assessment, raising and reporting of issues, and monitoring of remediation plans;</li>
<li>Participate in department-wide initiatives focused on continually improving firm processes and the control environment;</li>
<li>Assist with annual risk assessment process, audit plan creation, and other departmental administrative projects.</li>
</ul>
<p>Qualifications/Skills Required:</p>
<ul>
<li>12+ years of IT audit experience with exposure to core IT infrastructure, cyber security, equities trading, fixed-income trading, operations, and/or trade support functions;</li>
<li>Strong analytical and reporting skills and effective relationship-building experience;</li>
<li>Effective communication (verbal and written) and interpersonal skills, with the ability to present sophisticated and sensitive issues to management and inspire change;</li>
<li>Knowledge and experience of core IT infrastructure platforms (e.g., Windows, Unix, Sybase, SQL), cyber security, cloud technology, networks, firewalls, and/or data analytics;</li>
<li>Extensive knowledge of the audit lifecycle and the evaluation of IT general controls and IT automated controls;</li>
<li>Bachelor’s degree in Information Systems, Computer Science/Engineering, or other relevant fields;</li>
<li>A related certification (e.g., CISA, CISSP, CIA) is desired;</li>
<li>Domestic and international travel requirements: 0%-10%.</li>
</ul>
<p>The estimated base salary range for this position is $160,000 to $250,000, which is specific to New York and may change in the future.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$160,000 to $250,000</Salaryrange>
      <Skills>IT audit experience, core IT infrastructure, cyber security, equities trading, fixed-income trading, operations, trade support functions, analytical and reporting skills, relationship-building experience, communication (verbal and written) and interpersonal skills, knowledge of core IT infrastructure platforms, cloud technology, networks, firewalls, data analytics, audit lifecycle, IT general controls, IT automated controls</Skills>
      <Category>IT</Category>
      <Industry>Finance</Industry>
      <Employername>Millennium</Employername>
      <Employerlogo>https://logos.yubhub.co/mlp.eightfold.ai.png</Employerlogo>
      <Employerdescription>Millennium is a global alternative investment management firm.</Employerdescription>
      <Employerwebsite>https://mlp.eightfold.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://mlp.eightfold.ai/careers/job/755953849622</Applyto>
      <Location>New York, New York, United States of America</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>e8abf445-c26</externalid>
      <Title>Staff Applied AI Engineer, Enterprise GenAI</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Staff Applied AI Engineer to join our Enterprise Engineering team. As an Applied AI Engineer, you&#39;ll work with clients to create ML solutions that satisfy their business needs. Your work will range from building next-generation AI cybersecurity firewalls to creating transformative AI experiences in journalism to applying genomic foundation models to make predictions about life-saving drug proteins.</p>
<p>Daily data-driven experiments will provide key insights around model strengths and inefficiencies which you&#39;ll use to improve your product&#39;s performance. If you are excited about shaping the future of the modern AI movement, we would love to hear from you!</p>
<p>You will:</p>
<ul>
<li>Own, plan, and optimize the AI behind our Enterprise customers&#39; deepest technical problems</li>
<li>Leverage SGP to build the most advanced AI agents in the industry, including multimodal functionality, tool-calling, and more</li>
<li>Gather business requirements and translate them into technical solutions</li>
<li>Meet regularly with customer teams onsite and virtually, collaborating cross-functionally with all teams responsible for their data and ML needs</li>
<li>Push production code in multiple development environments, writing and debugging code directly in both our customer&#39;s and Scale&#39;s codebases.</li>
</ul>
<p>Ideally you&#39;d have:</p>
<ul>
<li>7+ years of full-time engineering experience, post-graduation</li>
<li>A love for solving deeply complex, ambiguous technical problems, using state-of-the-art research and AI to accomplish your client’s business goals</li>
<li>A strong engineering background: a Bachelor’s degree in Computer Science, Mathematics, or another quantitative field, or equivalent experience</li>
<li>Deep familiarity with a data-driven approach when iterating on machine learning models and how changes in datasets can influence model results</li>
<li>Experience working with a cloud technology stack (e.g., AWS or GCP) and developing machine learning models in a cloud environment</li>
<li>Proficiency in Python to write, test, and debug code using common libraries (e.g., NumPy, pandas)</li>
</ul>
<p>Nice to haves:</p>
<ul>
<li>Strong knowledge of software engineering best practices</li>
<li>Have built applications taking advantage of Generative AI in real, production use cases</li>
<li>Familiarity with state of the art LLMs and their strengths/weaknesses</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$216,000-$270,000 USD</Salaryrange>
      <Skills>Python, Machine Learning, Cloud Technology Stack, Data-Driven Approach, Software Engineering Best Practices, Generative AI, State of the Art LLMs</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale is an AI data foundry that helps fuel advancements in AI, including generative AI, defense applications, and autonomous vehicles.</Employerdescription>
      <Employerwebsite>https://scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4683689005</Applyto>
      <Location>San Francisco, CA; Seattle, WA; New York, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0cd9dcc6-813</externalid>
      <Title>Solutions Engineer, Enterprise</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Solutions Engineer to join our Enterprise team. As a Solutions Engineer, you will play a vital role in the development of AI applications. You will partner closely with Account Executives, Product, and Machine Learning Engineers to lead prospective customers through pre-sales, delivering customized demos and pilots to secure the &#39;technical win&#39;. You will scope customer technical requirements and develop an actionable Statement of Work. You will work closely with the delivery team to help with initial implementation.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Partner with Scale AEs on the customer journey, delivering tailored demos and prototypes according to the customer&#39;s requirements</li>
<li>Develop technical domain expertise in Generative AI / large language model applications for Enterprise use cases, including customers in financial services, insurance, SaaS, and similar enterprises</li>
<li>Be accountable for securing the &#39;technical win&#39; by unblocking technical challenges</li>
<li>Interact with customers daily to understand their needs and design solutions to better serve them</li>
<li>Design and develop &#39;Scopes of Work&#39; by breaking down customer challenges into a project plan</li>
<li>Work closely with forward-deployed Software and Machine Learning Engineers to develop agents in the initial post-sales stage</li>
<li>Work with AEs and PMs to identify customer-specific feature requests</li>
<li>Drive strategic initiatives to improve the efficiency and effectiveness of the Solutions Engineering team</li>
</ul>
<p>The ideal candidate will have:</p>
<ul>
<li>A strong engineering background with prior experience working with clients in a pre- or post-sales capacity to realise business goals</li>
<li>Prior experience developing with Python, Java, and/or other web development languages</li>
<li>Experience working in enterprise SaaS, cloud tech, finance, fintech, or similar industries in a technical capacity with end-customer engagement</li>
<li>A track record as a self-starter, motivated to independently unblock technical issues in the field with the customer, away from the mothership</li>
<li>Presentation skills with a high degree of technical credibility when speaking with executives and front-line engineers</li>
<li>A high level of comfort communicating effectively across internal and external organisations</li>
<li>Intellectual curiosity, empathy, and the ability to operate with high velocity</li>
</ul>
<p>Nice to have:</p>
<ul>
<li>GenAI experience</li>
<li>Forward-deployed engineering experience</li>
<li>Machine Learning experience</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Generative AI, large language model applications, Python, Java, web development languages, enterprise SaaS, cloud tech, finance, fintech, GenAI Experience, Forward deployed engineering experience, Machine Learning Experience</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale develops AI systems for the world&apos;s most important decisions.</Employerdescription>
      <Employerwebsite>https://www.scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4642876005</Applyto>
      <Location>London, UK</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>94b058de-e59</externalid>
      <Title>Solutions Engineer, Enterprise</Title>
      <Description><![CDATA[<p>We are seeking a Solutions Engineer to join our team. As a Solutions Engineer, you will play a vital role in the development of AI applications. You will partner closely with Account Executives, Product, and Machine Learning Engineers to lead prospective customers through pre-sales, delivering customized demos and pilots to secure the “technical win”. You will scope customer technical requirements and develop an actionable Statement of Work. You will work closely with the delivery team to help with initial implementation.</p>
<p>You will:</p>
<ul>
<li>Be accountable for securing the “technical win” by unblocking technical challenges</li>
<li>Interact with customers daily to understand their needs and design solutions to better serve them</li>
<li>Design and develop “Scopes of Work” by breaking down customer challenges into a project plan</li>
<li>Work closely with forward-deployed Software and Machine Learning Engineers to develop agents in the initial post-sales stage</li>
<li>Work with Account Executives and Project Managers to identify customer-specific feature requests</li>
<li>Drive strategic initiatives to improve the efficiency and effectiveness of the Solutions Engineering team</li>
</ul>
<p>Ideally, you&#39;d have:</p>
<ul>
<li>A strong engineering background with prior experience working with clients in a pre- or post-sales capacity to realize business goals</li>
<li>Prior experience developing with Python, Java, and/or other web development languages</li>
<li>Experience working in enterprise SaaS, cloud tech, finance, fintech, or similar industries in a technical capacity with end-customer engagement</li>
<li>A track record as a self-starter, motivated to independently unblock technical issues in the field with the customer, away from the mothership</li>
<li>Presentation skills with a high degree of technical credibility when speaking with executives and front-line engineers</li>
<li>A high level of comfort communicating effectively across internal and external organizations</li>
<li>Intellectual curiosity, empathy, and the ability to operate with high velocity</li>
</ul>
<p>Nice to haves:</p>
<ul>
<li>GenAI experience</li>
<li>Forward-deployed engineering experience</li>
<li>Machine Learning experience</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,000-$225,000 USD</Salaryrange>
      <Skills>Python, Java, web development languages, GenAI, Machine Learning, enterprise SaaS, cloud tech, finance, fintech, GenAI Experience, Forward deployed engineering experience, Machine Learning Experience</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale develops reliable AI systems for the world&apos;s most important decisions.</Employerdescription>
      <Employerwebsite>https://www.scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4554440005</Applyto>
      <Location>San Francisco, CA; New York, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>73943189-eef</externalid>
      <Title>Senior Software Engineer - Internal Tools &amp; Productivity</Title>
      <Description><![CDATA[<p>We&#39;re hiring a highly skilled Senior Software Engineer to help design, build, and operate secure, scalable infrastructure that empowers our employees to do their best work.</p>
<p>As a full stack engineer, you will work with your team to build amazing tools/applications for internal company use as well as external customers/partners.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Design, develop, test, and support full-stack applications on cloud-native distributed systems</li>
<li>Build real-time integrations with SaaS platforms across the organisation</li>
<li>Build quality frameworks and unit tests that ensure product quality, performance, and load handling, and debug and identify system issues</li>
<li>Collaborate with the broader teams, engage in engineering council, conduct code reviews, and improve our delivery process</li>
<li>Collaborate with the IAM team to manage cloud-based identity (Okta) and access controls, ensuring compliance with security policies and standards for internal applications</li>
<li>Work cross-functionally to identify areas for corporate technology service improvement and implement self-service solutions</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>5+ years of related experience with a Bachelor’s degree; or equivalent work experience</li>
<li>Proficiency with UI frameworks such as React or Angular, plus HTML, CSS, TypeScript, etc.</li>
<li>Proficiency with backend technologies, including API development, databases, and privacy permissions (e.g., Go, Python, PostgreSQL, and GraphQL)</li>
<li>Experience with end-to-end testing and documentation</li>
<li>Experience developing secure, scalable, and resilient applications on the cloud that handle sensitive data</li>
<li>Experience with cloud technologies, e.g., AWS, Azure, GCP, Docker, or Kubernetes</li>
<li>Familiarity with compliance frameworks such as SOC 2, ISO 27001, FedRAMP, and NIST.</li>
</ul>
<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits.</p>
<p>The base salary range for this full-time position in the location of San Francisco is: $216,000-$270,000 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$216,000-$270,000 USD</Salaryrange>
      <Skills>UI frameworks, backend technologies, API development, databases, privacy permissions, cloud technologies, compliance frameworks</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale AI</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale AI develops reliable AI systems for the world&apos;s most important decisions.</Employerdescription>
      <Employerwebsite>https://scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4654897005</Applyto>
      <Location>San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>5aa5b947-f4d</externalid>
      <Title>Staff Machine Learning Research Scientist/ Engineer, Agents</Title>
      <Description><![CDATA[<p>About Scale AI</p>
<p>At Scale AI, our mission is to accelerate the development of AI applications. This role is at the intersection of cutting-edge AI research and practical application, with a focus on studying the data types essential for building state-of-the-art agents.</p>
<p>Responsibilities</p>
<ul>
<li>Explore the data landscape needed to advance intelligent, adaptable AI agents, guiding the data strategy at Scale to drive innovation.</li>
<li>Contribute to impactful research publications on agents, collaborate with customer researchers, and work alongside the engineering team to translate these advancements into real-world, scalable solutions.</li>
</ul>
<p>Requirements</p>
<ul>
<li>Practical experience working with LLMs, with proficiency in frameworks like Pytorch, Jax, or Tensorflow.</li>
<li>A track record of published research in top ML venues (e.g., ACL, EMNLP, NAACL, NeurIPS, ICML, ICLR, COLM, etc.).</li>
<li>At least three years of experience addressing sophisticated ML problems, either in a research setting or product development.</li>
</ul>
<p>Nice to Have</p>
<ul>
<li>Hands-on experience with open source LLM fine-tuning or involvement in bespoke LLM fine-tuning projects using Pytorch/Jax.</li>
<li>Hands-on experience and publications in building applications and evaluations related to AI agents such as tool-use, text2SQL, browser agents, coding agents and GUI agents.</li>
<li>Hands-on experience with agent frameworks such as OpenHands, Swarm, LangGraph, etc.</li>
<li>Familiarity with agentic reasoning methods such as STaR and PLANSEARCH</li>
<li>Experience working with a cloud technology stack (e.g., AWS or GCP) and developing machine learning models in a cloud environment.</li>
</ul>
<p>Benefits</p>
<ul>
<li>Comprehensive health, dental and vision coverage</li>
<li>Retirement benefits</li>
<li>A learning and development stipend</li>
<li>Generous PTO</li>
<li>Commuter stipend</li>
</ul>
<p>Salary Range</p>
<p>$259,200-$324,000 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$259,200-$324,000 USD</Salaryrange>
      <Skills>Pytorch, Jax, Tensorflow, LLMs, Agent frameworks, Agentic reasoning methods, Cloud technology stack, Open source LLM fine-tuning, Bespoke LLM fine-tuning projects</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale AI</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale AI is a leading AI data foundry that provides high-quality data and full-stack technologies for the development of AI applications.</Employerdescription>
      <Employerwebsite>https://scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4488520005</Applyto>
      <Location>San Francisco, CA; Seattle, WA; New York, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8b8cbfe7-a98</externalid>
      <Title>Senior Software Engineer, Echo</Title>
      <Description><![CDATA[<p>Ready to be pushed beyond what you think you’re capable of?</p>
<p>At Coinbase, our mission is to increase economic freedom in the world.</p>
<p>We&#39;re seeking a very specific candidate who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system.</p>
<p>As a Senior Software Engineer on the Echo team, you will solve unique, large scale, highly complex technical problems, bridging the constraints posed by web-scale applications and blockchain technology.</p>
<p>You will help build the next generation of systems to make cryptocurrency accessible to everyone across the globe, operating real-time applications with high frequency, low latency updates, and managing the most secure, dockerized infrastructure running in the cloud.</p>
<p>The Echo team is responsible for two innovative products in the capital formation space: Echo and Sonar.</p>
<p>We are a small team that operates as a startup within the larger org, and we’re committed to shipping impactful product at a fast pace.</p>
<p>Echo, our marketplace for private investments, has facilitated over 300 deals and $150m invested since 2024, and Sonar, our public sales and compliance platform, enables customers to run their own token sales.</p>
<p>Our engineering team works across the whole stack and is empowered to take ownership of large projects.</p>
<p>What you&#39;ll be doing:</p>
<ul>
<li>Build new services to meet critical product and business needs using Golang.</li>
<li>Design scalable systems to solve novel problems with modern cloud technology and industry best practices.</li>
<li>Articulate a long-term vision for maintaining and scaling our backend systems and the teams running them.</li>
<li>Work with engineers, designers, product managers, and senior leadership to turn our product and technical vision into a tangible roadmap every quarter.</li>
<li>Write high quality, well tested code to meet the needs of your customers.</li>
</ul>
<p>What we look for in you:</p>
<ul>
<li>You have at least 5 years of experience in software engineering.</li>
<li>You’ve designed, built, scaled, and maintained production services, and know how to compose a service-oriented architecture.</li>
<li>You write high quality, well tested code to meet the needs of your customers.</li>
<li>You’re passionate about building an open financial system that brings the world together.</li>
<li>You can responsibly use generative AI tools and copilots (e.g., LibreChat, Gemini, Glean) in daily workflows, continuously learn as tools evolve, and apply human-in-the-loop practices to deliver business-ready outputs and drive measurable improvements in efficiency, cost, and quality.</li>
</ul>
<p>Nice to haves:</p>
<ul>
<li>You have gone through rapid growth in your company (from startup to mid-size).</li>
<li>You have experience with growth experiments or A/B testing frameworks.</li>
<li>You have experience with blockchain technology (such as Bitcoin, Ethereum, etc.).</li>
<li>You have experience decomposing a large monolith into microservices.</li>
<li>You’ve worked with Golang, Ruby, Docker, Rails, Postgres, MongoDB, or DynamoDB.</li>
<li>You’ve built financial, high-reliability, or security systems.</li>
</ul>
<p>Job #: (GB-CFBE05UK-Q126)</p>
<p>Pay Transparency Notice: The target annual base salary for this position can range as detailed below. Total compensation may also include equity and bonus eligibility and benefits (including medical, dental, and vision).</p>
<p>Annual base salary range (excluding equity and bonus):</p>
<p>£122,400-£136,000 GBP</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>£122,400-£136,000 GBP</Salaryrange>
      <Skills>Golang, Cloud technology, Service-oriented architecture, Blockchain technology, Generative AI tools and copilots, Ruby, Docker, Rails, Postgres, MongoDB, DynamoDB</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Coinbase</Employername>
      <Employerlogo>https://logos.yubhub.co/coinbase.com.png</Employerlogo>
      <Employerdescription>Coinbase is a digital currency exchange and wallet service that allows users to buy, sell, and store cryptocurrencies such as Bitcoin, Ethereum, and Litecoin.</Employerdescription>
      <Employerwebsite>https://www.coinbase.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/coinbase/jobs/7569402</Applyto>
      <Location>Remote - UK</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>9bb1344c-662</externalid>
      <Title>Sr. Solutions Engineer, Retail - CPG</Title>
      <Description><![CDATA[<p>We are looking for a Senior Solutions Engineer to join our team. As a Senior Solutions Engineer, you will work with large enterprises in the Retail and CPG space to help them become more data-driven. You will define and direct the technical strategy for our largest and most important accounts, leading to more widespread use of our products and wider and deeper adoption of ML &amp; AI.</p>
<p>You will work closely with the Account Executive to develop and execute a technical strategy that aligns with the customer&#39;s goals and objectives. You will also work with a team of engineers to build proofs of concept and demonstrate our products.</p>
<p>The ideal candidate will have a strong background in value selling, technical account management, and technical leadership. They will also have a solid understanding of big data, data science, and cloud technologies.</p>
<p>Responsibilities:</p>
<ul>
<li>Define and direct the technical strategy for our largest and most important accounts</li>
<li>Work closely with the Account Executive to develop and execute a technical strategy that aligns with the customer&#39;s goals and objectives</li>
<li>Collaborate with a team of engineers to build proofs of concept and demonstrate our products</li>
<li>Provide technical guidance and support to customers</li>
<li>Work with customers to identify and address technical issues</li>
</ul>
<p>Requirements:</p>
<ul>
<li>5+ years of experience working with large enterprises in the Retail and CPG space</li>
<li>3+ years of experience in a pre-sales capacity or supporting sales activity</li>
<li>Strong background in value selling, technical account management, and technical leadership</li>
<li>Solid understanding of big data, data science, and cloud technologies</li>
<li>Experience with design and implementation of big data technologies such as Hadoop, NoSQL, MPP, OLTP, and OLAP</li>
<li>Production programming experience in Python, R, Scala, or Java</li>
</ul>
<p>Nice to have:</p>
<ul>
<li>Databricks Certification</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>big data, data science, cloud technologies, Hadoop, NoSQL, MPP, OLTP, OLAP, Python, R, Scala, Java, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI. It has over 10,000 organisations worldwide as customers.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/7507778002</Applyto>
      <Location>Illinois</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>5196c4ac-d97</externalid>
      <Title>Senior Software Engineer - Infrastructure and Tools</Title>
      <Description><![CDATA[<p>We are seeking a Senior Software Engineer to join our Infrastructure teams. As a key member of our team, you will build scalable systems to power the Databricks platform, making it the de-facto platform for running Big Data and AI workloads.</p>
<p>Your responsibilities will include:</p>
<ul>
<li>Building and extending components of the core Databricks infrastructure</li>
<li>Architecting multi-cloud systems and abstractions to allow the Databricks product to run on top of existing cloud providers</li>
<li>Improving software development workflows for engineering and operational efficiency</li>
<li>Using our own data and AI platform to analyze build and test logs and metrics to identify areas for improvement</li>
<li>Developing automated build, test, and release infrastructure</li>
<li>Setting and upholding the standard for engineering processes to support high-quality engineering</li>
</ul>
<p>To succeed in this role, you will need a BS (or higher) in Computer Science, or a related field, and 5+ years of experience writing production code in one of Java, Scala, Go, C++, or Python. You should also have passion for building highly scalable and reliable infrastructure, experience architecting, developing, and deploying large-scale distributed systems at scale, and experience with cloud APIs and cloud technologies such as AWS, Azure, GCP, Docker, Kubernetes, or Terraform.</p>
<p>In addition to a competitive salary, we offer comprehensive health coverage, 401(k) plan, equity awards, flexible time off, paid parental leave, family planning, gym reimbursement, annual personal development fund, work headphones reimbursement, employee assistance program, and business travel accident insurance.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$166,000-$225,000 USD</Salaryrange>
      <Skills>Java, Scala, Go, C++, Python, Cloud APIs, Cloud technologies, AWS, Azure, GCP, Docker, Kubernetes, Terraform</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/6318503002</Applyto>
      <Location>San Francisco, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>fdc6f0f9-900</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, distributed computing, CI/CD, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461168002</Applyto>
      <Location>Los Angeles, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0fb2e339-447</externalid>
      <Title>Enterprise Hunter Account Executive (FSI - North)</Title>
      <Description><![CDATA[<p>As an Enterprise Account Executive in Databricks, you will be responsible for selling the company&#39;s enterprise cloud data platform powered by Apache Spark to financial services institutions in India. Your goal will be to close new accounts while maintaining existing ones, and to exceed activity, pipeline, and revenue targets.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Presenting a territory plan within the first 90 days</li>
<li>Meeting with CIOs, IT executives, LOB executives, program managers, and other important partners</li>
<li>Closing both new accounts and existing accounts</li>
<li>Identifying and closing quick, small wins while managing longer, complex sales cycles</li>
<li>Exceeding activity, pipeline, and revenue targets</li>
<li>Tracking all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
</ul>
<p>To succeed in this role, you will need to have 7+ years of experience in enterprise sales, with a proven track record of exceeding quotas and closing new accounts. You should also have a strong understanding of cloud technologies and be able to articulate intricate concepts simply.</p>
<p>In addition to your technical skills, you will need to be a strong communicator and be able to build relationships with key decision-makers. You should also be comfortable working in a fast-paced environment and be able to adapt to changing priorities.</p>
<p>If you are a motivated and results-driven sales professional who is looking for a new challenge, we encourage you to apply for this role.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise sales, Cloud technologies, Apache Spark, Salesforce, Customer relationship building, Big data, Data analytics, Artificial intelligence</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. Over 10,000 organizations worldwide rely on its platform.</Employerdescription>
      <Employerwebsite>https://databricks.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8438952002</Applyto>
      <Location>Delhi, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>2378a863-cea</externalid>
      <Title>Enterprise Account Executive - Financial Services</Title>
      <Description><![CDATA[<p>As an Enterprise Account Executive at Databricks, you will be a strategic sales professional experienced in selling into Financial Services accounts. You will know how to sell innovation and change through customer vision expansion and guide deals forward to compress decision cycles.</p>
<p>You will love understanding a product in depth and be passionate about communicating its value to Customers and System Integrators. You will always be looking for new opportunities and be asked to grow within existing accounts.</p>
<p>Along with the chance to close an exciting deal, we also offer accelerators above 100% quota attainment.</p>
<p>Your impact will be:</p>
<ul>
<li>Meet with CIOs, IT executives, LOB executives, Program Managers, and other important partners</li>
<li>Close both new accounts and existing accounts</li>
<li>Identify and close quick, small wins while managing longer, complex sales cycles</li>
<li>Exceed activity, pipeline, and revenue targets</li>
<li>Track all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
<li>Use a solution-based approach to selling and creating value for customers</li>
<li>Promote Databricks&#39; enterprise cloud data platform powered by Apache Spark™</li>
<li>Ensure 100% satisfaction among all customers</li>
<li>Prioritize opportunities and apply appropriate resources</li>
<li>Build a plan for success internally at Databricks and externally with your accounts</li>
</ul>
<p>We look for:</p>
<ul>
<li>You have previously worked in an early-stage company and you know how to navigate it and be successful</li>
<li>Field sales experience within big data, Cloud, or SaaS sales</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>The ability to articulate intricate cloud technologies simply</li>
<li>7+ years experience exceeding sales quotas</li>
<li>Expertise with financial services institutions preferable</li>
<li>Success closing new accounts while working existing accounts</li>
<li>Understanding of Spark and big data preferable</li>
<li>Passion for cloud technologies</li>
<li>Bachelor&#39;s Degree</li>
</ul>
<p>Pay Range Transparency: Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $272,000-$374,000 USD</p>
<p>Zone 2 Pay Range $272,000-$374,000 USD</p>
<p>Zone 3 Pay Range $272,000-$374,000 USD</p>
<p>Zone 4 Pay Range $272,000-$374,000 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$272,000-$374,000 USD</Salaryrange>
      <Skills>Field sales experience within big data, Cloud, or SaaS sales, Prior customer relationships with CIOs, program managers, and essential decision makers, Simply articulate intricate cloud technologies, 7+ years experience exceeding sales quotas, Expertise with financial services institutions preferable, Understanding of Spark and big data, Passion for cloud technologies</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8480190002</Applyto>
      <Location>New York City, New York; Remote - Washington D.C.</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>45cde3e1-29d</externalid>
      <Title>Applied AI Engineer, Enterprise GenAI</Title>
      <Description><![CDATA[<p>We&#39;re looking for an Applied AI Engineer to join our Enterprise Engineering team. As an Applied AI Engineer, you&#39;ll work with clients to create ML solutions to satisfy their business needs. Your work will range from building next-generation AI cybersecurity firewalls to creating transformative AI experiences in journalism to applying foundation genomic models that make predictions about life-saving drug proteins.</p>
<p>Daily data-driven experiments will provide key insights around model strengths and inefficiencies which you&#39;ll use to improve your product&#39;s performance. You&#39;ll own, plan, and optimize the AI behind our Enterprise customer&#39;s deepest technical problems, leveraging our Scale Generative Platform (SGP) to build the most advanced AI agents across the industry.</p>
<p>Responsibilities:</p>
<ul>
<li>Own, plan, and optimize the AI behind our Enterprise customer&#39;s deepest technical problems</li>
<li>Leverage SGP to build the most advanced AI agents across the industry including multimodal functionality, tool-calling, and more</li>
<li>Gather business requirements and translate them into technical solutions</li>
<li>Meet regularly with customer teams onsite and virtually, collaborating cross-functionally with all teams responsible for their data and ML needs</li>
<li>Push production code in multiple development environments, writing and debugging code directly in both our customer&#39;s and Scale&#39;s codebases.</li>
</ul>
<p>The ideal candidate will have a love for solving deeply complex, ambiguous technical problems using state-of-the-art research and AI to accomplish your client&#39;s business goals; a strong engineering background; deep familiarity with a data-driven approach when iterating on machine learning models; and experience working with a cloud technology stack and developing machine learning models in a cloud environment.</p>
<p>Nice to have: strong knowledge of software engineering best practices, experience building applications that take advantage of Generative AI in real, production use cases, and familiarity with state-of-the-art LLMs and their strengths/weaknesses.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$216,000-$270,000 USD</Salaryrange>
      <Skills>Python, Machine Learning, Cloud Technology Stack, Data-Driven Approach, Software Engineering Best Practices, Generative AI, State of the Art LLMs, Multimodal Functionality, Tool-Calling</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale is an AI data foundry that helps fuel advancements in AI, including generative AI, defense applications, and autonomous vehicles.</Employerdescription>
      <Employerwebsite>https://scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4514173005</Applyto>
      <Location>San Francisco, CA; New York, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c1bcd7d3-b33</externalid>
      <Title>Software Engineer, Fullstack (Omnichannel)</Title>
      <Description><![CDATA[<p>About Dialpad</p>
<p>Dialpad is the AI-native business communications platform. We unify calling, messaging, meetings, and contact center on a single platform - powered by AI that understands every conversation in real time.</p>
<p>We&#39;re seeking a talented and experienced Software Full-Stack Engineer passionate about building high-quality, scalable web applications using modern frontend &amp; backend technologies to build the next generation of our omnichannel Contact Center products.</p>
<p>Responsibilities</p>
<ul>
<li>Develop and maintain Dialpad&#39;s web applications using modern front-end and back-end technologies.</li>
<li>Provide estimates on technical resources and requirements necessary to plan and begin projects.</li>
<li>Take responsibility for executing projects in the omnichannel contact center communications space. Assist and drive, as needed, to ensure the team meets its delivery milestones.</li>
<li>Develop well-tested features with appropriate test hooks, resulting in low defect reports and faster engineering throughput.</li>
<li>Review technical designs to ensure features/products are well-integrated and fully meet business needs.</li>
<li>Participate in code reviews, design discussions, and other team activities to ensure high-quality software delivery.</li>
<li>Troubleshoot and debug issues with existing features, as needed.</li>
<li>Stay up to date with the latest backend platform technologies and best practices, and contribute to the continuous improvement of our engineering processes and tools.</li>
<li>Ensure features are shipped on time and to the highest quality standards.</li>
</ul>
<p>Requirements</p>
<ul>
<li>5+ years of strong experience in full-stack software engineering.</li>
<li>Bachelor’s or Master’s degree in Computer Science or related field, or equivalent experience.</li>
<li>Experience leveraging AI tools (Claude / Windsurf / Gemini) for development.</li>
<li>Strong experience working with HTML/CSS, Vue.js, TypeScript, Python, and Java.</li>
<li>Strong experience working with Cloud Technologies [Google Cloud Platform is a plus] and distributed technologies.</li>
<li>Working knowledge of unit test and integration test frameworks.</li>
<li>Good understanding of web technologies, RESTful APIs, and web application frameworks.</li>
<li>Experience with performance and optimization problems and a demonstrated ability to both diagnose and prevent them.</li>
<li>Strong debugging and troubleshooting skills.</li>
<li>Strong communication and collaboration skills.</li>
<li>Experience with highly agile and iterative development processes.</li>
</ul>
<p>Why Join Dialpad</p>
<ul>
<li>Work at the center of the AI transformation in business communications</li>
<li>Build and ship agentic AI products that are redefining how companies operate</li>
<li>Join a team where AI amplifies every employee’s impact</li>
<li>Competitive salary, comprehensive benefits, and real opportunities for growth</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>AI Tools (Claude / Windsurf / Gemini), HTML/CSS, Vue.js, Typescript, Python, Java, Cloud Technologies (Google Cloud Platform), distributed technologies, unit test and integration test frameworks, web technologies, RESTful APIs, web application frameworks, performance and optimization problems, debugging and troubleshooting skills, communication and collaboration skills, agile and iterative development processes</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Dialpad</Employername>
      <Employerlogo>https://logos.yubhub.co/dialpad.com.png</Employerlogo>
      <Employerdescription>Dialpad is the AI-native business communications platform, serving over 70,000 companies worldwide.</Employerdescription>
      <Employerwebsite>https://dialpad.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/dialpad/jobs/8407077002</Applyto>
      <Location>Buenos Aires, Argentina</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>90423d85-ea7</externalid>
      <Title>Senior Software Engineer - Fullstack</Title>
      <Description><![CDATA[<p>As a Full Stack software engineer, you will work with your team and product management to make insights from data simple. We are looking for engineers that are customer obsessed, who can take on the full scope of the product and user experience beyond the technical implementation. You&#39;ll set the foundation for how we build robust, scalable and delightful products.</p>
<p>Some examples of the experiences you&#39;ll create for our customers, spanning the full project lifecycle from loading data and visualizing results to creating statistical models and deploying production artifacts, include:</p>
<ul>
<li>Simple workflows to create, configure, and manage large-scale compute clusters, networks and data sources.</li>
<li>Workflows to create, deploy, test, and upgrade complex data pipelines, with powerful features to visualize data graphs.</li>
<li>Seamless onboarding and management for all members of an organisation to become data-driven.</li>
<li>Provide a great SQL-centric data exploration and dashboarding experience on Databricks.</li>
<li>An interactive environment for collaborative data projects at massive scale with an easy path to production.</li>
</ul>
<p>We are looking for engineers with 5+ years of experience with HTML, CSS, and JavaScript, passion for user experience and design, and a deep understanding of front-end architecture. You should be comfortable working towards a multi-year vision with incremental deliverables, motivated by delivering customer value, and experienced with modern JavaScript frameworks and server-side web technologies.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$166,000-$225,000 USD</Salaryrange>
      <Skills>HTML, CSS, JavaScript, SQL, Cloud technologies (AWS, Azure, GCP, Docker, or Kubernetes), Modern JavaScript frameworks (React, Angular, or VueJs/Ember), Server-side web technologies (Node.js, Java, Python, Scala, C#, C++, Go)</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks builds and runs the world&apos;s best Data Intelligence Platform, serving over 10,000 organisations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/5445641002</Applyto>
      <Location>Mountain View, California; San Francisco, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>5ceb4835-0f1</externalid>
      <Title>Manager, Professional Services</Title>
      <Description><![CDATA[<p>As a Manager, Professional Services, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>The impact you will have:</p>
<ul>
<li>You will work on a variety of impactful customer technical big data projects which may include building reference architectures, how-to&#39;s, and production-grade MVPs.</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications.</li>
<li>Consult on architecture and design; bootstrap or implement strategic customer projects which lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>10+ years of experience with Big Data Technologies such as Apache Spark, Kafka, Cloud Native, and Data Lakes in a customer-facing post-sales, technical architecture, or consulting role.</li>
<li>4+ years of people management experience, managing a team of Data Engineers, Data Architects, etc.</li>
<li>6+ years of experience working on Big Data Architectures independently.</li>
<li>Experience working across Cloud Platforms (GCP/AWS/Azure).</li>
<li>Experience working on Databricks platform is a plus.</li>
<li>Documentation and white-boarding skills.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Willingness to travel for onsite customer engagements within India.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Apache Spark, Kafka, Cloud Native, Data Lakes, Big Data Technologies, Data Engineering, Data Science, Cloud Technology, People Management, Team Leadership, Databricks, GCP, AWS, Azure, Documentation, White-boarding</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform to over 10,000 organizations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8503068002</Applyto>
      <Location>Remote - India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>d1ee6aec-ec7</externalid>
      <Title>Applied AI Engineer, Enterprise</Title>
      <Description><![CDATA[<p>We&#39;re looking for an Applied AI Engineer to join our Enterprise Engineering team. As an Applied AI Engineer, you&#39;ll work with clients to create ML solutions to satisfy their business needs. Your work will range from building next-generation AI cybersecurity firewalls to creating transformative AI experiences in journalism to applying foundation genomic models that make predictions about life-saving drug proteins.</p>
<p>Daily data-driven experiments will provide key insights around model strengths and inefficiencies which you&#39;ll use to improve your product&#39;s performance. You&#39;ll own, plan, and optimize the AI behind our Enterprise customer&#39;s deepest technical problems, leveraging our Scale Generative Platform (SGP) to build the most advanced AI agents across the industry.</p>
<p>Responsibilities:</p>
<ul>
<li>Own, plan, and optimize the AI behind our Enterprise customer&#39;s deepest technical problems</li>
<li>Leverage SGP to build the most advanced AI agents across the industry, including multimodal functionality, tool-calling, and more</li>
<li>Gather business requirements and translate them into technical solutions</li>
<li>Meet regularly with customer teams onsite and virtually, collaborating cross-functionally with all teams responsible for their data and ML needs</li>
<li>Push production code in multiple development environments, writing and debugging code directly in both our customer&#39;s and Scale&#39;s codebases</li>
</ul>
<p>Ideally, you&#39;d have:</p>
<ul>
<li>A love for solving deeply complex, ambiguous technical problems using state-of-the-art research and AI to accomplish your client&#39;s business goals</li>
<li>Strong engineering background: a Bachelor’s degree in Computer Science, Mathematics, or another quantitative field or equivalent strong engineering background</li>
<li>Deep familiarity with a data-driven approach when iterating on machine learning models and how changes in datasets can influence model results</li>
<li>Experience working with cloud technology stack (eg. AWS or GCP) and developing machine learning models in a cloud environment</li>
<li>Proficiency in Python to write, test and debug code using common libraries (ie numpy, pandas)</li>
</ul>
<p>Nice to haves:</p>
<ul>
<li>Strong knowledge of software engineering best practices</li>
<li>Have built applications taking advantage of Generative AI in real, production use cases</li>
<li>Familiarity with state-of-the-art LLMs and their strengths/weaknesses</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, Machine Learning, Cloud Technology Stack, Data-Driven Approach, Software Engineering Best Practices, Generative AI, State of the Art LLMs</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale is an AI data foundry that helps fuel advancements in AI, including generative AI, defense applications, and autonomous vehicles.</Employerdescription>
      <Employerwebsite>https://scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4536659005</Applyto>
      <Location>London, UK</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0a7cad02-cd5</externalid>
      <Title>Resident Solutions Architect - Manufacturing</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect (RSA) on our Professional Services team, you will work with customers on short- to medium-term engagements to solve their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>The impact you will have:</p>
<ul>
<li>Handle a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to customers&#39; successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Collaborate with the Databricks Technical, Project Manager, Architect, and Customer teams to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, managing scope and timelines</li>
<li>Documentation and whiteboarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Ability to travel up to 30% when needed</li>
</ul>
<p>Pay Range Transparency:</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494155002</Applyto>
      <Location>Philadelphia, Pennsylvania</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>de488dbe-d18</externalid>
      <Title>Vice President, Presales Solutions</Title>
      <Description><![CDATA[<p>We are seeking a Vice President, Presales Solutions to lead our presales team in defining market-winning strategies that align with our overarching business objectives. As a key member of our revenue team, you will partner with our growth and retention sales team to ensure that our solutions are properly scoped, architected, and presented to align with customer needs and requirements.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Lead and manage the team that partners with Sales (Growth and Retention) to define market-winning strategies that align with Komodo&#39;s overarching business objectives.</li>
<li>Lead and manage the team that acts as technical diplomats to C-suite stakeholders, translating business hurdles into actionable solutions using Komodo&#39;s suite of products and Healthcare Map.</li>
<li>Lead and manage the team that oversees the creation of customer-specific technical solutions that address requirements and long-term goals.</li>
<li>Collaborate with Product, Sales, Marketing, and other teams to craft and deliver compelling value propositions and demonstrations.</li>
<li>Lead and manage the team that provides thought leadership on the application of analytics and healthcare data to drive measurable customer success.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>15+ years of professional experience with a significant focus on high-growth SaaS, analytics, or AI environments.</li>
<li>8+ years of leadership experience, specifically having scaled a presales and/or technical consulting organization, ideally from an earlier stage to a multi-layered enterprise team.</li>
<li>Strategic Breadth: Proven ability to influence internal product direction and external customer strategy simultaneously.</li>
<li>Domain Knowledge: Experience navigating the Life Sciences and/or Healthcare space.</li>
<li>Technical Literacy: Awareness of data science tools, cloud technologies, and AI/ML applications.</li>
</ul>
<p>Expectations of AI Use in this role:</p>
<ul>
<li>Applies AI strategically, knowing how to use the right tools at the right time to solve problems or generate new ideas.</li>
<li>Uses AI to accelerate feedback loops, test ideas, and make informed decisions faster.</li>
<li>Applies both judgment and creativity when working with AI and understands its limitations.</li>
<li>Educates and enables others in the proper use of AI in the pre-sales domain.</li>
<li>Demonstrates curiosity and initiative in exploring new AI tools and workflows relevant to their role.</li>
<li>Has developed or shared AI-enhanced workflows, automations, or prompts with others.</li>
<li>Brings a reflexive AI mindset and uses AI as a natural part of their process to iterate, research, and deliver better work.</li>
<li>Comfortable working in AI-enhanced environments across writing, data analysis, content creation, or design.</li>
<li>Regularly leverages AI tools such as Gemini, Cursor, Perplexity, Claude, ChatGPT, or Microsoft Copilot to improve speed, clarity, or creative output.</li>
</ul>
<p>This position may be eligible for performance-based bonuses as determined in the Company&#39;s sole discretion and in accordance with a written agreement or plan. This role may also be eligible for equity awards. In addition, this role is eligible for benefits including, but not limited to, comprehensive health, dental, and vision insurance; flexible time off and holidays; 401(k) with company match; disability insurance and life insurance; and leaves of absence in accordance with applicable state and local laws and regulations and company policy.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$230,000-$311,000 USD</Salaryrange>
      <Skills>high-growth SaaS, analytics, AI environments, leadership experience, presales and/or technical consulting organization, data science tools, cloud technologies, AI/ML applications, Gemini, Cursor, Perplexity, Claude, ChatGPT, Microsoft Copilot</Skills>
      <Category>Engineering</Category>
      <Industry>Healthcare</Industry>
      <Employername>Komodo Health</Employername>
      <Employerlogo>https://logos.yubhub.co/komodohealth.com.png</Employerlogo>
      <Employerdescription>Komodo Health is a healthcare technology company that aims to reduce the global burden of disease by providing a platform for healthcare data analysis.</Employerdescription>
      <Employerwebsite>https://www.komodohealth.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/komodohealth/jobs/8439300002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>4c690d7e-4e0</externalid>
      <Title>Enterprise Account Executive - China</Title>
      <Description><![CDATA[<p>As an Enterprise Account Executive at Databricks, you will be responsible for selling to Enterprise accounts in the Greater China region. You will need experience selling to CIOs, IT executives, LOB executives, program managers, and other important partners. Your goal will be to close both new and existing accounts, identify and close quick, small wins while managing longer, complex sales cycles, exceed activity, pipeline, and revenue targets, track all customer details, and use a solution-based approach to selling and creating value for customers.</p>
<p>Your responsibilities will include:</p>
<ul>
<li>Presenting a territory plan within the first 90 days</li>
<li>Meeting with CIOs, IT executives, LOB executives, program managers, and other important partners</li>
<li>Closing both new accounts and existing accounts</li>
<li>Identifying and closing quick, small wins while managing longer, complex sales cycles</li>
<li>Exceeding activity, pipeline, and revenue targets</li>
<li>Tracking all customer details, including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
<li>Using a solution-based approach to selling and creating value for customers</li>
<li>Promoting Databricks&#39; enterprise cloud data platform powered by Apache Spark</li>
<li>Ensuring 100% satisfaction among all customers</li>
<li>Prioritising opportunities and applying appropriate resources</li>
<li>Building a plan for success internally at Databricks and externally with your accounts</li>
</ul>
<p>To be successful in this role, you will need:</p>
<ul>
<li>Field sales experience within big data, Cloud, and SaaS sales, covering the Greater China territory</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>The ability to simply articulate intricate cloud technologies</li>
<li>7+ years of Enterprise Sales experience exceeding quotas, covering relevant accounts and industries</li>
<li>Success in closing new accounts while working on existing accounts</li>
<li>An understanding of Spark and big data</li>
<li>Business proficiency in Mandarin</li>
<li>Experience in the GCR territory</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Field sales experience within big data, Cloud, and SaaS sales, Prior customer relationships with CIOs, program managers, and essential decision makers, Ability to simply articulate intricate cloud technologies, 7+ years of Enterprise Sales experience exceeding quotas, Understanding of Spark and big data, Business proficiency in Mandarin, Experience in the GCR territory</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. It has over 10,000 organisations worldwide as clients.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8470125002</Applyto>
      <Location>Remote - China</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>25abb310-047</externalid>
      <Title>Manager, Field Engineering</Title>
      <Description><![CDATA[<p>As a Manager, Field Engineering, you will lead a diverse team of technical pre-sales Solutions Architects covering the Asean region.</p>
<p>You’ll play a key leadership role at the intersection of technology and business, guiding your team to design impactful solutions and drive growth through collaboration and innovation.</p>
<p>You’ll empower your team to communicate complex ideas with clarity, support strategic enterprise customer engagements, and build lasting partnerships with clients and internal stakeholders.</p>
<p>The impact you will have:</p>
<ul>
<li>Lead, mentor, and support a high-performing, inclusive pre-sales team.</li>
<li>Foster an environment of belonging, continuous learning, and psychological safety that reflects Databricks’ values of customer focus, teamwork, and diversity.</li>
<li>Work closely with Sales and other teams to identify opportunities and deliver data-driven, value-focused solutions.</li>
<li>Establish best practices that improve team efficiency, collaboration, and business impact.</li>
<li>Build trusted relationships with customers and partners, acting as a strategic advisor in their digital transformation journey.</li>
<li>Partner with Marketing, Sales, Services, and other cross-functional teams to ensure a seamless customer experience.</li>
<li>Represent Databricks as part of the regional leadership team, contributing to our local presence and inclusive culture.</li>
</ul>
<p>What we’re looking for:</p>
<p>You do not need to fulfill every single requirement to be a strong candidate. If you are excited about this role and have related experience, we encourage you to apply.</p>
<ul>
<li>Proven experience leading or managing technical teams in Big Data, Cloud, or SaaS environments, or equivalent experience gained through hands-on engineering or consulting work.</li>
<li>A track record of coaching and developing diverse, high-performing teams.</li>
<li>Strong collaboration skills and the ability to partner effectively with Sales, Marketing, and other cross-functional leaders.</li>
<li>Excellent communication and interpersonal skills.</li>
<li>Technical background in Data Engineering, Databases, or Data Science (consulting or customer-facing experience preferred).</li>
<li>Enthusiasm for data, AI, and cloud technologies, and the ability to connect these to real business outcomes.</li>
</ul>
<p>About Databricks:</p>
<p>Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI.</p>
<p>To learn more, follow Databricks on Twitter, LinkedIn and Facebook.</p>
<p>Benefits:</p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region click here.</p>
<p>Our Commitment to Diversity and Inclusion:</p>
<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.</p>
<p>Compliance:</p>
<p>If access to export-controlled technology or source code is required for performance of job duties, it is within Employer’s discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Big Data, Cloud, SaaS, Data Engineering, Databases, Data Science, Technical leadership, Team management, Collaboration, Communication, Data, AI, and cloud technologies, Business outcomes</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8438767002</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>fc79e6e5-5c0</externalid>
      <Title>Resident Solutions Architect - Manufacturing</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect (RSA) on our Professional Services team, you will work with customers on short- to medium-term engagements to solve their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>The impact you will have:</p>
<ul>
<li>Handle a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to customers&#39; successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Collaborate with the Databricks Technical, Project Manager, Architect, and Customer teams to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, managing scope and timelines</li>
<li>Documentation and whiteboarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Ability to travel up to 30% when needed</li>
</ul>
<p>Pay Range Transparency:</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, Data engineering, Data science, Cloud technology</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494156002</Applyto>
      <Location>Seattle, Washington</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>a1ccc8c6-f09</externalid>
      <Title>Geo Hunter Account Executive, Manufacturing &amp; High-Tech</Title>
      <Description><![CDATA[<p>As a Geo Hunter Enterprise Account Executive at Databricks, you will be responsible for selling into and activating Large Manufacturing accounts. You will be a strategic sales professional with experience in selling innovation and change through customer vision expansion. Your goal will be to guide deals forward to compress decision cycles and close exciting deals. We offer accelerators above 100% quota attainment.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Meeting with CIOs, IT executives, LOB executives, Program Managers, and other important partners</li>
<li>Closing both new accounts and existing accounts</li>
<li>Identifying and closing quick, small wins while managing longer, complex sales cycles</li>
<li>Exceeding activity, pipeline, and revenue targets</li>
<li>Tracking all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
<li>Using a solution-based approach to selling and creating value for customers</li>
<li>Promoting Databricks&#39; enterprise cloud data platform powered by Apache Spark</li>
<li>Ensuring 100% satisfaction among all customers</li>
<li>Prioritizing opportunities and applying appropriate resources</li>
<li>Building a plan for success internally at Databricks and externally with your accounts</li>
</ul>
<p>We are looking for someone with:</p>
<ul>
<li>Previous experience in an early-stage company and knowledge of how to navigate and be successful</li>
<li>Field sales experience within big data, Cloud, or SaaS sales</li>
<li>Experience managing large, complex Manufacturing accounts is preferred</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>Ability to simply articulate intricate cloud technologies</li>
<li>5+ years of experience exceeding sales quotas</li>
<li>Success closing new accounts while working on existing accounts</li>
<li>Understanding of Spark and big data preferred</li>
<li>Passion for cloud technologies</li>
<li>Bachelor&#39;s Degree</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$167,100-$229,800 USD</Salaryrange>
      <Skills>big data, Cloud, SaaS sales, sales quotas, Spark, Apache Spark, Delta Lake, MLflow, cloud technologies, customer vision expansion, solution-based approach, customer satisfaction</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8193347002</Applyto>
      <Location>Northeast - United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>f79572c2-264</externalid>
      <Title>Technical Support Engineer</Title>
      <Description><![CDATA[<p>The Technical Support Engineer acts as a Starburst SME for a book of Majors and Strategic accounts. The role involves providing support for standard and custom deployments, answering technical questions, and assisting with supported LTS upgrades. The engineer will also be responsible for peer training and development, personal continued education, and contributing to reference documentation.</p>
<p>Responsibilities:</p>
<ul>
<li>Provide support for standard and custom deployments</li>
<li>Answer break/fix and non-break/fix technical questions through SFDC ticketing system</li>
<li>Efficiently reproduce reported issues by leveraging tools (minikube, minitrino, docker-compose, etc.), identify root causes, and provide solutions</li>
<li>Open SEP and Galaxy bug reports in Jira and feature requests in Aha!</li>
</ul>
<p>LTS Upgrades:</p>
<ul>
<li>Provide upgrade support upon customer request</li>
<li>Customer must be on a supported LTS version at the time of request</li>
<li>TSE must communicate unsupported LTS requests to the Account team as these require PS services</li>
</ul>
<p>Monthly Technical check-ins</p>
<ul>
<li>Conduct regularly scheduled technical check-ins with each BU</li>
<li>Discuss open support tickets, provide updates on product bugs and provide best practice recommendations based on your observations and ticket trends</li>
<li>Responsible for ensuring customer environments are on supported LTS versions</li>
</ul>
<p>Knowledge Sharing/Technical Enablement:</p>
<p>Knowledge exchange and continued technical enablement are crucial to the development of our team and the customer experience. It&#39;s essential that we keep our product expertise and documentation current and that all team members have access to information.</p>
<ul>
<li>Contribute to our reference documentation</li>
<li>Lead peer training</li>
<li>Act as a consultant to our content teams</li>
<li>Own your personal technical education journey</li>
</ul>
<p>Project Involvement</p>
<ul>
<li>Contribute to or drive components of departmental and cross-functional initiatives</li>
</ul>
<p>Partner with Leadership</p>
<ul>
<li>Identify areas of opportunity with potential solutions for inefficiencies or obstacles within the team and cross-functionally</li>
<li>Provide feedback to your manager on continuing education opportunities, project ideas, etc.</li>
</ul>
<p>Requirements</p>
<ul>
<li>5+ years of support experience</li>
<li>3+ years of Big Data, Docker, Kubernetes and cloud technologies experience</li>
<li>Ability to Travel: This role will require 25% in-person travel for purposes including but not limited to new hire onboarding, team and department offsites, customer engagements, and other company events</li>
</ul>
<p>Skills</p>
<ul>
<li>Big Data (Hadoop, Data Lakes, Spark)</li>
<li>Docker and Kubernetes</li>
<li>Cloud technologies (AWS, Azure, GCP)</li>
<li>Security - Authentication (LDAP, OAuth2.0) and Authorization technologies</li>
<li>SSL/TLS</li>
<li>Linux Skills</li>
<li>DBMS Concepts/SQL Exposure</li>
<li>Languages: SQL, Java, Python, Bash</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Big Data, Docker, Kubernetes, Cloud technologies, Security, Linux Skills, DBMS Concepts</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Starburst</Employername>
      <Employerlogo>https://logos.yubhub.co/starburst.io.png</Employerlogo>
      <Employerdescription>Starburst is a data platform company that provides analytics, applications, and AI services. It has customers in over 60 countries.</Employerdescription>
      <Employerwebsite>https://www.starburst.io/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/starburst/jobs/5124882008</Applyto>
      <Location>Hyderabad, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>40fb0673-f64</externalid>
      <Title>Technical Support Engineer</Title>
      <Description><![CDATA[<p>The Technical Support Engineer acts as a Starburst SME for a book of Majors and Strategic accounts. The role involves providing support for standard and custom deployments, answering technical questions, and assisting with supported LTS upgrades. The TSE is also responsible for peer training and development, personal continued education, and contributing to our reference documentation.</p>
<p>Responsibilities:</p>
<ul>
<li>Provide support for standard and custom deployments</li>
<li>Answer break/fix and non-break/fix technical questions through SFDC ticketing system</li>
<li>Efficiently reproduce reported issues by leveraging tools (minikube, minitrino, docker-compose, etc.), identify root causes, and provide solutions</li>
<li>Open SEP and Galaxy bug reports in Jira and feature requests in Aha!</li>
</ul>
<p>LTS Upgrades:</p>
<ul>
<li>Provide upgrade support upon customer request</li>
<li>Customer must be on a supported LTS version at the time of request</li>
<li>TSE must communicate unsupported LTS requests to the Account team as these require PS services</li>
</ul>
<p>Monthly Technical check-ins</p>
<ul>
<li>Conduct regularly scheduled technical check-ins with each BU</li>
<li>Discuss open support tickets, provide updates on product bugs and provide best practice recommendations based on your observations and ticket trends</li>
</ul>
<p>Knowledge Sharing/Technical Enablement:</p>
<p>Knowledge exchange and continued technical enablement are crucial to the development of our team and the customer experience. It&#39;s essential that we keep our product expertise and documentation current and that all team members have access to information.</p>
<ul>
<li>Contribute to our reference documentation</li>
<li>Lead peer training</li>
<li>Act as a consultant to our content teams</li>
<li>Own your personal technical education journey</li>
</ul>
<p>Project Involvement</p>
<ul>
<li>Contribute to or drive components of departmental and cross-functional initiatives</li>
</ul>
<p>Partner with Leadership</p>
<ul>
<li>Identify areas of opportunity with potential solutions for inefficiencies or obstacles within the team and cross-functionally</li>
<li>Provide feedback to your manager on continuing education opportunities, project ideas, etc.</li>
</ul>
<p>Requirements</p>
<ul>
<li>5+ years of support experience</li>
<li>3+ years of Big Data, Docker, Kubernetes and cloud technologies experience</li>
<li>Ability to Travel: This role will require 25% in-person travel for purposes including but not limited to new hire onboarding, team and department offsites, customer engagements, and other company events.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>265 000-335 000 PLN</Salaryrange>
      <Skills>Big Data, Docker, Kubernetes, Cloud technologies, Security - Authentication, Authorization technologies, SSL/TLS, Linux Skills, DBMS Concepts/SQL Exposure</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Starburst</Employername>
      <Employerlogo>https://logos.yubhub.co/starburst.io.png</Employerlogo>
      <Employerdescription>Starburst is a data platform company that provides analytics, applications, and AI solutions. It has customers in over 60 countries.</Employerdescription>
      <Employerwebsite>https://www.starburst.io/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/starburst/jobs/5034562008</Applyto>
      <Location>Warsaw, Poland</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>f54568d1-7a9</externalid>
      <Title>Emerging Enterprise Account Executive</Title>
      <Description><![CDATA[<p>As an Emerging Enterprise Account Executive at Databricks, you will have a strong command of the SaaS sales process. You will know how to sell innovation and change through customer vision expansion and guide deals forward to compress decision cycles. You will love understanding a product in depth and be passionate about communicating its value to customers and partners. You will always be prospecting for new opportunities and closing new accounts. Along with the chance to close exciting deals, we also offer accelerators above 100% quota attainment. You will report to a regional sales manager.</p>
<p>The impact you will have:</p>
<ul>
<li>Assess your accounts and develop a strategy to identify and engage all buying centers</li>
<li>Use a solution-based approach to selling and create value for customers</li>
<li>Identify the most viable use cases in each account to maximize Databricks&#39; impact</li>
<li>Orchestrate and work with teams to maximize the impact on your ecosystem</li>
<li>Build value through all engagements to promote successful negotiations to the close point</li>
<li>Promote Databricks&#39; enterprise cloud data platform powered by Apache Spark</li>
<li>Stay customer-focused by delivering technical and business results using the Databricks Platform</li>
<li>Promote teamwork...it makes the dream work!</li>
</ul>
<p>What we look for:</p>
<ul>
<li>You have previously worked in an early-stage company and know how to navigate and be successful in a fast-growing organization</li>
<li>Sales experience within big data, Cloud, or SaaS sales</li>
<li>Prior customer relationships with CIOs and important decision-makers</li>
<li>The ability to clearly articulate intricate cloud and big data technologies</li>
<li>Experience exceeding sales quotas</li>
<li>Success closing new accounts while upselling existing accounts</li>
<li>2+ years of full-cycle closing sales experience in SaaS/PaaS companies</li>
<li>Bachelor&#39;s Degree</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SaaS sales process, Cloud technologies, Big data, Customer relationship management, Solution selling, Account management, Teamwork</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a software company that provides a data and AI platform. It has over 10,000 customers worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/7716357002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8aadecbf-9e0</externalid>
      <Title>Geo Hunter Account Executive, Manufacturing</Title>
      <Description><![CDATA[<p>As a Geo Hunter Account Executive at Databricks, you will be a strategic sales professional experienced in selling into and activating Large Manufacturing accounts. You will know how to sell innovation and change through customer vision expansion and guide deals forward to compress decision cycles. You will love understanding a product in depth and be passionate about communicating its value to Customers and System Integrators.</p>
<p>Your responsibilities:</p>
<ul>
<li>Meet with CIOs, IT executives, LOB executives, Program Managers, and other important partners</li>
<li>Close both new accounts and existing accounts</li>
<li>Identify and close quick, small wins while managing longer, complex sales cycles</li>
<li>Exceed activity, pipeline, and revenue targets</li>
<li>Track all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
<li>Use a solution-based approach to selling and create value for customers</li>
<li>Promote Databricks&#39; enterprise cloud data platform powered by Apache Spark</li>
<li>Ensure 100% satisfaction among all customers</li>
<li>Prioritize opportunities and apply appropriate resources</li>
<li>Build a plan for success internally at Databricks and externally with your accounts</li>
</ul>
<p>What we look for:</p>
<ul>
<li>You have previously worked in an early-stage company and know how to navigate and be successful</li>
<li>Field sales experience within big data, Cloud, or SaaS sales</li>
<li>Experience managing large, complex Manufacturing accounts</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>The ability to clearly articulate intricate cloud technologies</li>
<li>5+ years of experience exceeding sales quotas</li>
<li>Success closing new accounts while working existing accounts</li>
<li>An understanding of Spark and big data</li>
</ul>
<p>The pay range for this role is $167,100-$229,800 USD, and the total compensation package may also include eligibility for annual performance bonus, equity, and benefits.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$167,100-$229,800 USD</Salaryrange>
      <Skills>big data, Cloud, SaaS sales, Salesforce, Apache Spark, customer relationship management, solution-based selling, Spark, cloud technologies</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI for over 10,000 organizations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8438296002</Applyto>
      <Location>California; Remote - Colorado; Remote - Oregon; Remote - Washington</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1e3127c0-24c</externalid>
      <Title>Platinum Support Representative, Tier 2</Title>
      <Description><![CDATA[<p>As a member of the Platinum Support team at Dialpad, you will deliver amazing service and support to our users by providing fast and accurate responses in a courteous and professional manner. You will handle user and partner inquiries ranging from simple product questions to more complex technical support issues.</p>
<p>Responsibilities:</p>
<ul>
<li>Troubleshoot customer issues, escalate bug reports, and work to drive issue resolution</li>
<li>Work effectively with a variety of internal teams, including Sales, Engineering, and Product Management</li>
<li>Create and maintain tickets with our engineering team at a high technical level</li>
<li>Monitor all live channels (chat, phone, and web form) as scheduled</li>
<li>Communicate with Dialpad partners quickly, effectively, and professionally</li>
<li>Attend advanced trainings to become an expert in our products and services</li>
<li>Adhere to all policies and procedures set forth by Dialpad and the Director/Manager of the Platinum Support team</li>
<li>Be a team player and maintain the set SLA (Service Level Agreement) for each partner, Platinum Support customer, and general support interaction</li>
<li>Maintain the required Quality Assurance score for the Platinum Support team</li>
<li>Maintain the target number of tickets completed every week and strive for one-touch resolution</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$57,000-$72,000 CAD</Salaryrange>
      <Skills>Customer support (5+ years), English communication, LAN/WAN, Cloud technology, VoIP, Call center experience</Skills>
      <Category>Customer Support</Category>
      <Industry>Technology</Industry>
      <Employername>Dialpad</Employername>
      <Employerlogo>https://logos.yubhub.co/dialpad.com.png</Employerlogo>
      <Employerdescription>Dialpad is a business communications platform that unifies calling, messaging, meetings, and contact center on a single platform powered by AI.</Employerdescription>
      <Employerwebsite>https://dialpad.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/dialpad/jobs/8483991002</Applyto>
      <Location>Kitchener, Canada</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c515a83d-ecc</externalid>
      <Title>Strategic Core Account Executive - Discrete Manufacturing</Title>
      <Description><![CDATA[<p>We are looking for a strategic and consultative Strategic Account Executive to join the team in Germany and maximise the significant market opportunity that exists for Databricks within the industrial and manufacturing sector.</p>
<p>As a primary driver of &quot;Industrie 4.0,&quot; this organisation is a cornerstone of the German economy, transforming how the world manufactures, moves, and heals. Dual-headquartered in Munich and Berlin, this powerhouse employs nearly 300,000 people across a vast multi-divisional ecosystem.</p>
<p>The impact you&#39;ll have:</p>
<ul>
<li>You will be part of the large account team for this flagship account, driving sustained growth across consumption, expansion, and new business.</li>
<li>You will consistently exceed growth targets by translating account strategy into clear, measurable commercial outcomes and disciplined execution.</li>
<li>You will operate as a trusted C-suite advisor, influencing executive decisions and shaping enterprise-wide Data &amp; AI transformation initiatives.</li>
<li>You will identify, prioritise, and scale high-value AI use cases to deliver measurable business outcomes.</li>
<li>You will drive partner-led growth, working closely with system integrators and strategic partners to increase deal momentum and customer impact.</li>
<li>You will lead complex, multi-stakeholder negotiations, closing transformational agreements that strengthen the strategic partnership.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Proven success selling advanced data, analytics, Big Data, AI, or complex cloud technology, closing complex, multi-stakeholder, multi-year enterprise agreements.</li>
<li>A proven record of exceeding ambitious revenue goals in large, global enterprise accounts within the Industrial/Manufacturing vertical in Germany.</li>
<li>Deep understanding of consumption-based growth models and how to scale strategic accounts from initial wins.</li>
<li>Proficiency in structured sales methodologies (e.g., MEDDPICC, Value Selling).</li>
<li>A history of building champion networks and leading cross-functional account teams around a clear strategy.</li>
<li>Candidates with a consulting background who combine a strong sales profile with deep Industrial vertical expertise are encouraged to apply.</li>
<li>Readiness to travel regularly within Germany and internationally to stay closely aligned with the customer.</li>
<li>Fluency in German and English, with the gravitas to influence senior leaders up to C-level.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>advanced data, analytics, Big Data, AI, complex cloud technology, structured sales methodologies, MEDDPICC, Value Selling</Skills>
      <Category>Sales</Category>
      <Industry>Manufacturing</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. It has over 10,000 customers worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8496220002</Applyto>
      <Location>Berlin, Germany</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1125d83c-1eb</externalid>
      <Title>Staff Software Engineer - Backend</Title>
      <Description><![CDATA[<p>As a Staff Software Engineer with a backend focus, you will work closely with your team and product management to prioritise, design, implement, test, and operate micro-services for the Databricks platform and product.</p>
<p>This involves writing software in Scala/Java, building data pipelines (Apache Spark, Apache Kafka), integrating with third-party applications, and interacting with cloud APIs (AWS, Azure, CloudFormation, Terraform).</p>
<p>You will be part of a team that builds highly technical products that fulfil real, important needs in the world. We constantly push the boundaries of data and AI technology, while simultaneously operating with the resilience, security and scale that is critical to making customers successful on our platform.</p>
<p>Our engineering teams build one of the largest scale software platforms. The fleet consists of millions of virtual machines, generating terabytes of logs and processing exabytes of data per day.</p>
<p>We run thousands of Kubernetes clusters across all regions and orchestrate millions of VMs on a daily basis.</p>
<p>Competencies:</p>
<ul>
<li>BS/MS/PhD in Computer Science, or a related field</li>
<li>10+ years of production level experience in one of: Java, Scala, C++, or similar language</li>
<li>Comfortable working towards a multi-year vision with incremental deliverables</li>
<li>Experience in architecting, developing, deploying, and operating large scale distributed systems</li>
<li>Experience working on a SaaS platform or with Service-Oriented Architectures</li>
<li>Good knowledge of SQL</li>
<li>Experience with software security and systems that handle sensitive data</li>
<li>Experience with cloud technologies, e.g. AWS, Azure, GCP, Docker, Kubernetes</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$182,400-$247,000 USD</Salaryrange>
      <Skills>Java, Scala, C++, Apache Spark, Apache Kafka, Cloud APIs, AWS, Azure, CloudFormation, Terraform, SQL, Software security, Cloud technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks enables data teams to solve the world&apos;s toughest problems by building and running the world&apos;s best data and AI infrastructure platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/6779233002</Applyto>
      <Location>Bellevue, Washington</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>5bca90fa-192</externalid>
      <Title>Enterprise Account Executive, Financial Services</Title>
      <Description><![CDATA[<p>As an Enterprise Account Executive at Databricks, you will be a strategic sales professional experienced in selling into Financial Services accounts. You will know how to sell innovation and change through customer vision expansion and guide deals forward to compress decision cycles.</p>
<p>You will love understanding a product in depth and be passionate about communicating its value to Customers and System Integrators. You will always be looking for new opportunities and be asked to grow within existing accounts.</p>
<p>Along with the chance to close an exciting deal, we also offer accelerators above 100% quota attainment.</p>
<p>The impact you will have:</p>
<ul>
<li>Meet with CIOs, IT executives, LOB executives, Program Managers, and other important partners</li>
<li>Close both new accounts and existing accounts</li>
<li>Identify and close quick, small wins while managing longer, complex sales cycles</li>
<li>Exceed activity, pipeline, and revenue targets</li>
<li>Track all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
<li>Use a solution-based approach to selling and creating value for customers</li>
<li>Promote Databricks&#39; enterprise cloud data platform powered by Apache Spark</li>
<li>Ensure 100% satisfaction among all customers</li>
<li>Prioritize opportunities and apply appropriate resources</li>
<li>Build a plan for success internally at Databricks and externally with your accounts</li>
</ul>
<p>What we look for:</p>
<ul>
<li>You have previously worked in an early stage company and you know how to navigate and be successful</li>
<li>Field sales experience within big data, Cloud, or SaaS sales</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>The ability to clearly articulate intricate cloud technologies</li>
<li>6+ years experience exceeding sales quotas</li>
<li>Expertise with financial services institutions preferable</li>
<li>Success closing new accounts while working existing accounts</li>
<li>Understanding of Spark and big data preferable</li>
<li>Passion for cloud technologies</li>
<li>Bachelor&#39;s Degree</li>
</ul>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$272,000-$374,000 USD</Salaryrange>
      <Skills>Field sales, big data, Cloud, SaaS sales, customer relationships, cloud technologies, Apache Spark, financial services</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8367665002</Applyto>
      <Location>Boston, Massachusetts; New Jersey; New York City, New York</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>f5d41e5d-664</externalid>
      <Title>Strategic Enterprise Hunter Account Executive, Financial Services</Title>
      <Description><![CDATA[<p>As a Strategic Enterprise Hunter Account Executive at Databricks, you will be responsible for selling Databricks&#39; enterprise cloud data platform powered by Apache Spark to Financial Services clients. You will work closely with CIOs, IT executives, LOB executives, Program Managers, and other important partners to identify and close new accounts and existing accounts. Your responsibilities will include meeting with key decision-makers, identifying and closing quick, small wins while managing longer, complex sales cycles, exceeding activity, pipeline, and revenue targets, and building a plan for success internally at Databricks and externally with your accounts.</p>
<p>Key Requirements:</p>
<ul>
<li>6+ years of experience in field sales, preferably in big data, Cloud, or SaaS sales</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>Ability to clearly articulate intricate cloud technologies</li>
<li>Greenfield + Hunter Sales Experience</li>
<li>Expertise with financial services institutions preferable</li>
<li>Understanding of Spark and big data preferable</li>
<li>Passion for cloud technologies</li>
</ul>
<p>In addition to a competitive salary, you will also be eligible for annual performance bonus, equity, and comprehensive benefits and perks.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$272,000-$374,000 USD</Salaryrange>
      <Skills>field sales, big data, Cloud, SaaS sales, customer relationships, cloud technologies, Spark, financial services institutions</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8346287002</Applyto>
      <Location>Boston, Massachusetts; New Jersey; New York City, New York; Remote - Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>ffd169d9-40b</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term engagements to address their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integration with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to customers&#39; successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and whiteboarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Willingness to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Willingness to travel to customers 20% of the time</li>
</ul>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, data platforms &amp; analytics, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified data intelligence platform to over 10,000 organizations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>180656</Compensationmin>
      <Compensationmax>248360</Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461239002</Applyto>
      <Location>Atlanta, Georgia</Location>
      <Country>United States</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>30898795-a57</externalid>
      <Title>Strategic Enterprise Hunter Account Executive, Telco</Title>
      <Description><![CDATA[<p>As a Strategic Enterprise Hunter Account Executive at Databricks, you will be a strategic sales professional experienced in selling into enterprise accounts. Your role will involve guiding deals forward to compress decision cycles, understanding products in depth, and communicating their value to customers and system integrators.</p>
<p>Key responsibilities include meeting with CIOs, IT executives, LOB executives, program managers, and other important partners, closing both new and existing accounts, identifying and closing quick, small wins while managing longer, complex sales cycles, exceeding activity, pipeline, and revenue targets, and tracking customer details using Salesforce.</p>
<p>To succeed in this role, you will need to have previously worked in an early-stage company, have field sales experience within big data, cloud, or SaaS sales, and have hunter (net-new logo) experience. You will also need to have prior customer relationships with CIOs, program managers, and essential decision-makers, be able to simply articulate intricate cloud technologies, and have 7+ years of experience exceeding sales quotas.</p>
<p>In exchange for your hard work, Databricks offers accelerators above 100% quota attainment, the chance to close exciting deals, and a comprehensive benefits package.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Cloud sales, Big data sales, SaaS sales, Hunter (net-new logo) experience, Prior customer relationships with CIOs, program managers, and essential decision-makers, Ability to simply articulate intricate cloud technologies, Spark and big data, Passion for cloud technologies</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. Over 10,000 organisations worldwide rely on its platform.</Employerdescription>
      <Employerwebsite>https://databricks.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8438124002</Applyto>
      <Location>Toronto, Canada</Location>
      <Country>Canada</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>bac99a46-7f5</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term engagements to address their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integration with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to customers&#39; successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and whiteboarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Willingness to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Willingness to travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>180656</Compensationmin>
      <Compensationmax>248360</Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461243002</Applyto>
      <Location>Denver, Colorado</Location>
      <Country>United States</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>34b561b0-b37</externalid>
      <Title>Director, Enterprise - Retail &amp; CPG</Title>
      <Description><![CDATA[<p>We are looking for a Sales Director, Strategic Accounts to join our growing business in Germany. As a Sales Director, you will lead a team of Strategic Account Executives across the Retail and CPG verticals, mentoring, guiding, and empowering them to achieve and exceed their goals.</p>
<p>Your primary focus will be on strategic account expansion, driving growth by expanding relationships with our most important customers. You will strengthen and scale the team through high-impact hiring, hands-on coaching, and by fostering a culture built on collaboration, accountability, and results.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Leveraging your business network to build a strong talent pipeline and hire top candidates as the team grows</li>
<li>Creating a clear regional growth and investment plan within your first 90 days</li>
<li>Building and sponsoring trusted relationships with customers and partners to drive long-term success in the region</li>
<li>Ensuring accurate forecasting and creating a predictable, high-growth business</li>
<li>Coaching your team to lead with strong vision setting and methodology-based selling, staying aligned to our customers&#39; goals and outcomes</li>
<li>Developing a solid understanding of our product&#39;s technical details and roadmap to earn trust with key stakeholders</li>
</ul>
<p>We are looking for a proven people leader with 7+ years of experience leading high-performing Enterprise sales teams that sell into strategic global accounts in Germany. You should have a proven track record of developing high-performing teams in similar high-growth, Data, AI, Cloud, or SaaS/Tech companies, consistently exceeding ambitious sales goals.</p>
<p>The ideal candidate will have extensive knowledge of the Retail and CPG vertical, and proven relationships within these accounts. You should know how to spot and grow great talent, building teams that raise the bar through trust, accountability, and shared success.</p>
<p>Fluency in German and English is essential. You must also be willing and able to travel to customer sites on a regular basis, with some international travel for internal meetings and events.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Sales leadership, Strategic account management, Team management, Forecasting, Methodology-based selling, Data analysis, Cloud technology, SaaS/Tech, German language skills, Customer relationship management, Business development, Market research, Competitor analysis, Sales strategy development</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8287373002</Applyto>
      <Location>Hesse, Germany</Location>
      <Country>Germany</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1dc80c94-f57</externalid>
      <Title>Senior Software Engineer (Money)</Title>
      <Description><![CDATA[<p>At Databricks, we are seeking a Senior Software Engineer to join our Money team in Bengaluru, India. As one of the first engineers for Money at Databricks India, you will be key to building a base for one of Databricks&#39; most central engineering teams.</p>
<p>You will own critical components that form the backbone of our products, starting with Databricks&#39; resource admission control and usage governance infrastructure. Your role is crucial in helping bring diverse business needs together, including abuse prevention, product commercialization motions, and reliable product availability at scale.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Owning Money systems and services that govern usage of all Databricks products and offerings.</li>
<li>Enhancing engineering and infrastructure efficiency, reliability, accuracy, and response times, including CI/CD processes, test frameworks, data quality assurance, end-to-end reconciliation, and anomaly detection.</li>
<li>Collaborating with platform and product teams to develop and implement innovative infrastructure that scales to meet evolving needs.</li>
<li>Contributing to long-term vision and requirements development for Databricks products, in partnership with our engineering teams.</li>
</ul>
<p>We are looking for a candidate with a strong background in software engineering, preferably in Java, Scala, C++, or similar languages. You should have 7+ years of production-level experience and a proven track record in architecting, developing, deploying, and operating components of large-scale distributed systems.</p>
<p>If you are passionate about delivering high-quality solutions and have a proactive approach, we encourage you to apply for this exciting opportunity.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Java, Scala, C++, Software Security, Cloud Technologies, AWS, Azure, GCP, Docker, Kubernetes</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform to over 10,000 organizations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/7654347002</Applyto>
      <Location>Bengaluru, India</Location>
      <Country>India</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>e50b3bd6-fbc</externalid>
      <Title>Strategic Enterprise Hunter Account Executive</Title>
      <Description><![CDATA[<p>As a Strategic Enterprise Hunter Account Executive at Databricks, you will be a strategic sales professional experienced in selling into enterprise accounts. You will know how to sell innovation and change through customer vision expansion and guide deals forward to compress decision cycles.</p>
<p>You will love understanding a product in depth and be passionate about communicating its value to customers and system integrators. You will always be looking for new opportunities and will be asked to close net new logos.</p>
<p>Along with the chance to close an exciting deal, we also offer accelerators above 100% quota attainment.</p>
<p>Your impact will be to:</p>
<ul>
<li>Meet with CIOs, IT executives, LOB executives, program managers, and other important partners</li>
<li>Close both new accounts and existing accounts</li>
<li>Identify and close quick, small wins while managing longer, complex sales cycles</li>
<li>Exceed activity, pipeline, and revenue targets</li>
<li>Track all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
<li>Use a solution-based approach to selling and creating value for customers</li>
<li>Promote Databricks&#39; enterprise cloud data platform powered by Apache Spark</li>
<li>Ensure 100% satisfaction among all customers</li>
<li>Prioritize opportunities and apply appropriate resources</li>
<li>Build a plan for success internally at Databricks and externally with your accounts</li>
</ul>
<p>We look for:</p>
<ul>
<li>Previous experience in an early-stage company and knowledge of how to navigate and be successful</li>
<li>Field sales experience within big data, cloud, or SaaS sales</li>
<li>Hunter (net-new logo) experience</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision-makers</li>
<li>Ability to simply articulate intricate cloud technologies</li>
<li>7+ years of experience exceeding sales quotas</li>
<li>Success in closing new accounts while working existing accounts</li>
<li>Understanding of Spark and big data is preferable</li>
<li>Passion for cloud technologies</li>
<li>Bachelor&#39;s degree</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Sales, Cloud, Big Data, Apache Spark, Data Intelligence, Customer Vision Expansion, Solution-Based Selling, Hunter (Net-New Logo), Prior Customer Relationships, Intricate Cloud Technologies, Sales Quotas, Cloud Technologies</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI for over 10,000 organizations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8308566002</Applyto>
      <Location>Toronto, Canada</Location>
      <Country>Canada</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b7f3fd73-10e</externalid>
      <Title>RVP, Financial Services (Hunter)</Title>
      <Description><![CDATA[<p>We are seeking an RVP, Financial Services (Hunter) to join our world-class sales organization. As a key member of our Exec Sales department, you will be responsible for hiring and leading a high-performing team of account executives, implementing sales plans, and helping to develop new business.</p>
<p>The ideal candidate has a track record of exceeding revenue goals, of leading sales professionals to be their best selves, and is comfortable in a complex, technical sales environment.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Drive revenue success: Own and exceed quarterly/annual sales targets.</li>
<li>Build and implement strategic plans: Develop and execute evolving revenue plans and growth tactics.</li>
<li>Build and manage the team: Hire, manage, and motivate a growing team of sales executives; coach each member via joint selling and regular pipeline reviews.</li>
<li>Build trust-based relationships: Develop long-term relationships with employees, partners, and cross-functional teams.</li>
<li>Distill customer needs and value: Enable your team to understand customer goals and how they relate to the Databricks value proposition.</li>
<li>Manage the front-line voice of Databricks: Lead your team to effectively communicate the value proposition through proposals and presentations.</li>
</ul>
<p>We look for candidates with 10+ years of successful, progressive experience in enterprise software sales, including Director-level experience at a reputable organization, ideally within Financial Services. In-depth knowledge of how software is positioned and sold to IT and/or Data executives, ideally within the insurance, wealth &amp; asset management verticals, is also required.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>enterprise software sales, director-level experience, financial services, software positioning, IT and/or Data executives, insurance, wealth &amp; asset management, complex sales environment, Big Data, AI, Spark, cloud technologies</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. Over 10,000 organizations worldwide rely on Databricks.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8459033002</Applyto>
      <Location>San Francisco, California</Location>
      <Country>United States</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1e0df4a3-dc0</externalid>
      <Title>Strategic Enterprise Account Executive - Life Sciences</Title>
      <Description><![CDATA[<p>As a Strategic Enterprise Account Executive at Databricks, you will be responsible for maintaining and growing a single existing account in the life sciences industry. You will work closely with CIOs, IT executives, LOB executives, Program Managers, and other important partners to identify and close quick, small wins while managing longer, complex sales cycles.</p>
<p>Your key responsibilities will include:</p>
<ul>
<li>Meeting with CIOs, IT executives, LOB executives, Program Managers, and other important partners</li>
<li>Closing both new accounts and existing accounts</li>
<li>Identifying and closing quick, small wins while managing longer, complex sales cycles</li>
<li>Exceeding activity, pipeline, and revenue targets</li>
<li>Tracking all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
<li>Using a solution-based approach to selling and creating value for customers</li>
<li>Promoting Databricks&#39; enterprise cloud data platform powered by Apache Spark</li>
<li>Ensuring 100% satisfaction among all customers</li>
<li>Prioritizing opportunities and applying appropriate resources</li>
<li>Building a plan for success internally at Databricks and externally with your accounts</li>
</ul>
<p>To succeed in this role, you will need to have:</p>
<ul>
<li>7+ years of experience exceeding sales quotas</li>
<li>Field sales experience within big data, Cloud, or SaaS sales</li>
<li>Experience managing large, complex Life Sciences accounts is preferred</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>The ability to simply articulate intricate cloud technologies</li>
<li>A passion for cloud technologies</li>
<li>A Bachelor&#39;s Degree</li>
</ul>
<p>In addition to a competitive salary, you will also be eligible for annual performance bonus, equity, and benefits.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$272,000-$374,000 USD</Salaryrange>
      <Skills>field sales experience, big data, Cloud, SaaS sales, sales quotas, customer relationships, cloud technologies, Apache Spark, Delta Lake, MLflow</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company, founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>272000</Compensationmin>
      <Compensationmax>374000</Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8359439002</Applyto>
      <Location>Remote - California; Remote - Washington</Location>
      <Country>United States</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3922bc3d-027</externalid>
      <Title>Staff Software Engineer - Backend</Title>
      <Description><![CDATA[<p>At Databricks, we are obsessed with enabling data teams to solve the world&#39;s toughest problems, from security threat detection to cancer drug development. We do this by building and running the world&#39;s best data and AI infrastructure platform, so our customers can focus on the high-value challenges that are central to their own missions.</p>
<p>As a software engineer with a backend focus, you will work closely with your team and product management to prioritize, design, implement, test, and operate microservices for the Databricks platform and product. This involves, among other things, writing software in Scala/Java, building data pipelines (Apache Spark, Apache Kafka), integrating with third-party applications, and interacting with cloud APIs (AWS, Azure, CloudFormation, Terraform).</p>
<p>Some example teams you can join include:</p>
<ul>
<li>Data Science and Machine Learning Infrastructure: Build services and infrastructure at the intersection of machine learning and distributed systems.</li>
<li>Compute Fabric: Build the resource management infrastructure powering all the big data and machine learning workloads on the Databricks platform in a robust, flexible, secure, and cloud-agnostic way.</li>
<li>Data Plane Storage: Deliver reliable and high-performance services and client libraries for storing and accessing humongous amounts of data on cloud storage backends, e.g., AWS S3, Azure Blob Store.</li>
<li>Enterprise Platform: Offer a simple and powerful experience for onboarding and managing all of a customer&#39;s data teams across tens of thousands of users on the Databricks platform.</li>
<li>Observability: Provide a world-class platform for Databricks engineers to comprehensively observe and introspect their applications and services.</li>
<li>Service Platform: Build high-quality services and manage them in all environments in a unified way.</li>
<li>Core Infra: Build the core infrastructure that powers Databricks, making it available across all geographic regions and cloud providers.</li>
</ul>
<p>The ideal candidate will have:</p>
<ul>
<li>BS/MS/PhD in Computer Science, or a related field</li>
<li>10+ years of production-level experience in one of: Java, Scala, C++, or similar language</li>
<li>Comfortable working towards a multi-year vision with incremental deliverables</li>
<li>Experience in architecting, developing, deploying, and operating large-scale distributed systems</li>
<li>Experience working on a SaaS platform or with Service-Oriented Architectures</li>
<li>Good knowledge of SQL</li>
<li>Experience with software security and systems that handle sensitive data</li>
<li>Experience with cloud technologies, e.g. AWS, Azure, GCP, Docker, Kubernetes</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$192,000-$260,000 USD</Salaryrange>
      <Skills>Java, Scala, C++, Apache Spark, Apache Kafka, Cloud APIs, AWS, Azure, CloudFormation, Terraform, SQL, Software security, Cloud technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks enables data teams to solve the world&apos;s toughest problems by building and running the world&apos;s best data and AI infrastructure platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/6544443002</Applyto>
      <Location>Mountain View, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>219928ef-6de</externalid>
      <Title>Resident Solutions Architect - Healthcare &amp; Life Sciences</Title>
<Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term engagements addressing their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable consultants who complete projects to specification while delivering excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to the customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494148002</Applyto>
      <Location>Philadelphia, Pennsylvania</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>ef4b5565-792</externalid>
      <Title>Manager, Software Engineering</Title>
      <Description><![CDATA[<p>We are seeking a Manager, Software Engineering to lead a team of software engineers in delivering a variety of software integrated into our products. This includes autonomy, simulation, data processing, payload integration, and off-board command and control or decision support.</p>
<p>As a Manager, Software Engineering, you will be responsible for demonstrating end-to-end outcome ownership of a major system within an integrated product, and the team responsible for building and maintaining it. You will contribute as a team lead to the rapid architecting, design, delivery, support, and evolution of next-generation autonomous platforms through their entire product life-cycle.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Empathizing with end-users and driving solutions that balance their needs with external constraints, restrictions, and requirements in a multi-stakeholder environment</li>
<li>Being accountable for software-enabled solutions that are deployed to customers, optimizing for the delivery of value to the end-user</li>
<li>Collaborating with your Technical Lead to architect scalable software that rapidly delivers capability beyond the scope of current platforms, with a clear path for both architecture and capability evolution over time</li>
<li>Coordinating your team&#39;s roadmap and execution with other teams across Anduril, with the aim of developing components that are reusable across multiple Anduril product lines</li>
<li>Contributing to the design, implementation, and execution of development processes for the initial delivery and subsequent iteration of vehicle and mission software, including full lifecycle testing, monitoring, and operation</li>
<li>Managing a 6-18 month roadmap for your team, nested within the broader organisational roadmap</li>
<li>Managing an allocated budget for your team</li>
<li>Managing programmatic risk for your team, and collaborating with your Technical Lead to manage technical risk, including sound and timely decision making</li>
<li>Leading by example as a technically competent, trustworthy, and accountable team lead</li>
<li>Communicating organisational vision, strategy, and direction to your team</li>
<li>Defining, documenting, gaining consensus for, and communicating appropriate goals and plans for your team, derived from broader organisational vision, strategy, and priorities</li>
<li>Building your team through mentoring, professional development, career management, and collaboration with Anduril&#39;s recruiting and people functions</li>
<li>Working as a leader of a multi-disciplinary engineering team of 4-10 members, including as a mentor and manager for Engineers from differing backgrounds</li>
<li>Reporting to a manager who may or may not have a background in software engineering</li>
<li>Traveling to co-locate with end-users and/or other teams up to 20% of the time</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Bachelor&apos;s degree in Robotics, Mechatronics, Computer Science, Engineering, or equivalent experience, Experience in a management position within a high-performing technology organisation, Extensive professional experience working as a Software Engineer with one or more domains and/or technologies of expertise, Capacity to lead a team that works holistically on software-enabled capabilities up and down the software stack and through lifecycle through design, implementation, operation, and sustainment, Capacity to act as the owner for a software system, including stakeholder engagement, requirements definition, roadmap management, team coordination, design, implementation management, sustainment, and evolution, Experience in a senior role for the delivery of a military mission system and/or autonomous vehicle, Experience writing backend services or embedded software in C, C++, Rust, and/or Go, Experience writing frontend applications using Typescript and React, Experience working with an RTOS, Experience with the design, implementation, and operation of horizontally scaled cloud technologies, Experience with the design, implementation, and support of embedded software, particularly in the field of robotics, Experience with modeling and simulation, Familiarity with communications busses and protocols (e.g., CAN, CANFD, UART/RS232/RS422/RS485, SPI, QSPI, I2C, Ethernet, ARINC-825, ARINC-429, MIL-STD-1553, etc.), Experience with development of high-assurance safety-critical software, including with DO-178, IEC 61508, or similar standards, Experience in design and development of embedded applications in autonomous vehicle software systems, Experience in developing interfaces to sensors and actuators, Experience working with and testing electrical and mechanical systems, Familiarity with navigation and communications systems, Experience within the product delivery lifecycle, including manufacturing, system acceptance, deployment, and sustainment, Familiarity with Systems Engineering concepts</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anduril Industries</Employername>
      <Employerlogo>https://logos.yubhub.co/anduril.com.png</Employerlogo>
      <Employerdescription>Anduril Industries is a defense technology company that specializes in transforming U.S. and allied military capabilities with advanced technology.</Employerdescription>
      <Employerwebsite>https://www.anduril.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/andurilindustries/jobs/4950096007</Applyto>
      <Location>Sydney, New South Wales, Australia</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b34dfe7b-d84</externalid>
      <Title>Senior Software Engineer - Backend</Title>
      <Description><![CDATA[<p>We are seeking a Senior Software Engineer - Backend to join our team in Vancouver. As a Senior Software Engineer, you will be responsible for designing, developing, and maintaining large-scale distributed systems. You will work on a variety of projects, including Log Analytics, AI/BI, Unity Catalog Business Semantics, and Databricks Apps.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Design and develop large-scale distributed systems using Java, Scala, or C++</li>
<li>Develop and maintain high-quality code that meets the requirements of the project</li>
<li>Collaborate with cross-functional teams to identify and prioritize project requirements</li>
<li>Troubleshoot and resolve complex technical issues</li>
<li>Stay up-to-date with industry trends and emerging technologies</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Bachelor&#39;s degree in Computer Science or related field</li>
<li>5+ years of experience in software development</li>
<li>Strong foundation in algorithms and data structures</li>
<li>Experience with cloud technologies, such as AWS, Azure, or GCP</li>
<li>Experience with security and systems that handle sensitive data</li>
<li>Good knowledge of SQL</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Master&#39;s degree in Computer Science or related field</li>
<li>Experience with big data technologies, such as Hadoop or Spark</li>
<li>Experience with containerization, such as Docker</li>
<li>Experience with DevOps practices, such as continuous integration and delivery</li>
</ul>
<p>Pay Range Transparency The pay range for this role is $146,200-$201,100 CAD per year, depending on experience and qualifications.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$146,200-$201,100 CAD</Salaryrange>
      <Skills>Java, Scala, C++, Cloud technologies, Security, SQL, Big data technologies, Containerization, DevOps practices</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI. It has over 10,000 customers worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8093295002</Applyto>
      <Location>Vancouver, Canada</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>38f38a04-ff3</externalid>
      <Title>Strategic Enterprise Account Executive, Healthcare</Title>
      <Description><![CDATA[<p>As a Strategic Enterprise Account Executive at Databricks, you will be responsible for maintaining and growing a single existing account in Chicago, IL. This role requires a strategic sales professional with experience in selling into a single Large Healthcare account. You will need to understand how to sell innovation and change through customer vision expansion and guide deals forward to compress decision cycles.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Meeting with CIOs, IT executives, LOB executives, Program Managers, and other important partners</li>
<li>Closing both new accounts and existing accounts</li>
<li>Identifying and closing quick, small wins while managing longer, complex sales cycles</li>
<li>Exceeding activity, pipeline, and revenue targets</li>
<li>Tracking all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
<li>Using a solution-based approach to selling and creating value for customers</li>
<li>Promoting Databricks&#39; enterprise cloud data platform powered by Apache Spark™</li>
<li>Ensuring 100% satisfaction among all customers</li>
<li>Prioritising opportunities and applying appropriate resources</li>
<li>Building a plan for success internally at Databricks and externally with your accounts</li>
</ul>
<p>We are looking for someone with:</p>
<ul>
<li>Previous experience in an early-stage company and knowledge of how to navigate and be successful</li>
<li>Field sales experience within big data, Cloud, or SaaS sales</li>
<li>Experience managing large, complex Healthcare accounts (preferred)</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>The ability to simply articulate intricate cloud technologies</li>
<li>7+ years of experience exceeding sales quotas</li>
<li>Success closing new accounts while working existing accounts</li>
<li>An understanding of Spark and big data (preferred)</li>
<li>Passion for cloud technologies</li>
<li>Bachelor&#39;s Degree</li>
</ul>
<p>The pay range for this role is $272,000-$374,000 USD, and the total compensation package may also include eligibility for annual performance bonus, equity, and benefits.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$272,000-$374,000 USD</Salaryrange>
      <Skills>field sales experience, big data, Cloud, SaaS sales, Spark, customer relationships, sales quotas, cloud technologies, Apache Spark, Databricks Data Intelligence Platform, data analytics, AI</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company whose Databricks Data Intelligence Platform unifies and democratizes data, analytics and AI for over 10,000 organisations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8417773002</Applyto>
      <Location>Chicago, Illinois; Remote - Illinois</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>f18e7306-00c</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
<Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short- to medium-term engagements addressing their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable consultants who complete projects to specification while delivering excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects, leading to the customer&#39;s successful understanding, evaluation, and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark and knowledge of Apache Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have:</p>
<ul>
<li>Databricks Certification</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, Databricks, CI/CD, MLOps, technical project delivery, documentation, white-boarding, client management, conflict management, scalable streaming, batch solutions, cloud-native components</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a company that provides data and AI solutions. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461325002</Applyto>
      <Location>Philadelphia, Pennsylvania</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>939ef2f9-52f</externalid>
      <Title>Core Account Executive - Hong Kong</Title>
      <Description><![CDATA[<p>As a Core Account Executive in Databricks, you will be responsible for selling Databricks&#39; enterprise cloud data platform powered by Apache Spark to customers in the Greater China region.</p>
<p>Your key responsibilities will include:</p>
<ul>
<li>Presenting a territory plan within the first 90 days</li>
<li>Meeting with CIOs, IT executives, LOB executives, Program Managers, and other important partners</li>
<li>Closing both new accounts and existing accounts</li>
<li>Identifying and closing quick, small wins while managing longer, complex sales cycles</li>
<li>Exceeding activity, pipeline, and revenue targets</li>
<li>Tracking all customer details, including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
</ul>
<p>You will report to the Director of Enterprise Sales and will be expected to engage with customers in Mandarin/Cantonese to conduct technical and business discussions, address sales challenges, and present clear value propositions and outcomes.</p>
<p>To be successful in this role, you will need to have:</p>
<ul>
<li>Field sales experience within big data, Cloud, and SaaS sales, covering Greater China territory</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>The ability to simply articulate intricate cloud technologies</li>
<li>7+ years of Enterprise Sales experience exceeding quotas, covering relevant accounts and industries</li>
<li>Success in closing new accounts while working on existing accounts</li>
<li>Understanding of Spark and big data is preferable</li>
<li>Business proficiency in Mandarin/Cantonese and experience in the GCR territory are required</li>
</ul>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please click here.</p>
<p>Our Commitment to Diversity and Inclusion</p>
<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Field sales experience, Big data, Cloud, and SaaS sales, Greater China territory, Prior customer relationships, CIOs, program managers, and essential decision makers, Ability to simply articulate intricate cloud technologies, Enterprise Sales experience, Exceeding quotas, Relevant accounts and industries, Success in closing new accounts, Understanding of Spark and big data, Business proficiency in Mandarin/Cantonese, Experience in the GCR territory</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8470087002</Applyto>
      <Location>Hong Kong</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>678647af-3f7</externalid>
      <Title>Staff Software Engineer (Money)</Title>
      <Description><![CDATA[<p>We are seeking a Staff Software Engineer to join our Money team at Databricks India. As one of the first engineers for Money at Databricks India, you will be key to building a base for one of Databricks&#39; most central engineering teams.</p>
<p>Your role is crucial in helping bring diverse business needs together, including abuse prevention, product commercialisation motions, and reliable product availability at scale. You will work closely with infrastructure as well as product teams in bringing critical governance functionality to Databricks customers.</p>
<p>Responsibilities:</p>
<ul>
<li>Own Money systems and services that govern usage of all Databricks products and offerings.</li>
<li>Enhance engineering and infrastructure efficiency, reliability, accuracy, and response times, including CI/CD processes, test frameworks, data quality assurance, end-to-end reconciliation, and anomaly detection.</li>
<li>Collaborate with platform and product teams to develop and implement innovative infrastructure that scales to meet evolving needs.</li>
<li>Provide leadership in long-term vision and requirements development for Databricks products, in partnership with our engineering teams.</li>
<li>Represent Databricks at academic and industrial conferences &amp; events.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>BS/MS/PhD in Computer Science, or a related field</li>
<li>12+ years of production level experience in one of: Java, Scala, C++, or similar language.</li>
<li>Comfortable working towards a multi-year vision with incremental deliverables.</li>
<li>Proven track record in architecting, developing, deploying, and operating large scale distributed systems.</li>
<li>Experience with software security and systems that handle sensitive data.</li>
<li>Demonstrated leadership skills and the ability to lead across functional and organizational boundaries.</li>
<li>A proactive approach and a passion for delivering high-quality solutions.</li>
<li>Experience with cloud technologies, e.g. AWS, Azure, GCP, Docker, Kubernetes.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Java, Scala, C++, Cloud technologies, Software security, Distributed systems, Leadership skills, AWS, Azure, GCP, Docker, Kubernetes</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform to over 10,000 organizations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/7654349002</Applyto>
      <Location>Bengaluru, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>61b49b86-6c8</externalid>
      <Title>Resident Solutions Architect - Manufacturing</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect (RSA) on our Professional Services team, you will work with customers on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and complete projects to specification with excellent customer service.</p>
<p>The impact you will have:</p>
<ul>
<li>Handle a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Collaborate with the Databricks Technical, Project Manager, Architect, and Customer teams to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>You will report to the regional Manager/Lead.</p>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Ability to travel up to 30% when needed</li>
</ul>
<p>Pay Range Transparency Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8341313002</Applyto>
      <Location>New York City, New York</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>9a8c0e7b-f30</externalid>
      <Title>Senior Solutions Engineering Manager, Enterprise, Named Accounts</Title>
      <Description><![CDATA[<p>About Us</p>
<p>At Cloudflare, we are on a mission to help build a better Internet. Today the company runs one of the world’s largest networks that powers millions of websites and other Internet properties for customers ranging from individual bloggers to SMBs to Fortune 500 companies. Cloudflare protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code.</p>
<p>As a Senior Solutions Engineering Manager, you will be responsible for managing a cross-functional team of Solutions Engineers who work with Sales to grow our account base. The team partners with Enterprise Named teams, Customer Success, and Partner organisations to provide the value of Cloudflare products to customers and prospects.</p>
<p>Responsibilities</p>
<ul>
<li>Manage a team of Solutions Engineers who are geographically distributed in the US East Enterprise Named region.</li>
<li>Work with Sales to grow our account base.</li>
<li>Partner with Enterprise Named teams, Customer Success, and Partner organisations to provide the value of Cloudflare products to customers and prospects.</li>
</ul>
<p>Requirements</p>
<ul>
<li>8+ years of relevant experience with a Bachelor’s Degree in Computer Science, Engineering, Management Information Systems or another related field or its equivalent.</li>
<li>5+ years of experience with internet technologies / protocols and / or cloud technologies.</li>
<li>Minimum 3 years of team lead/team management experience.</li>
<li>Experience managing opportunity sales cycles and identifying technical proof activities.</li>
<li>Demonstrated operations and organisation skills, implementing and driving best practices with cross-functional organisations.</li>
</ul>
<p>Benefits</p>
<ul>
<li>Estimated annual salary of $233,000 - $285,000 for New York City, Washington D.C., and Seattle-based hires.</li>
<li>Equity participation in Cloudflare’s equity plan.</li>
<li>Comprehensive benefits package, including health and welfare benefits, financial benefits, and time off.</li>
</ul>
<p>What Makes Cloudflare Special?</p>
<ul>
<li>We are a highly ambitious, large-scale technology company with a soul.</li>
<li>We are committed to protecting the free and open Internet.</li>
<li>We have a strong culture of innovation and collaboration.</li>
</ul>
<p>If you are passionate about technology and people, and have the ability to explain complex technical concepts in easy-to-understand terms, we would love to hear from you!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>distributed</Workarrangement>
      <Salaryrange>$233,000 - $285,000</Salaryrange>
      <Skills>internet technologies, cloud technologies, team lead/team management, operations and organisation skills, cross-functional organisations</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Cloudflare</Employername>
      <Employerlogo>https://logos.yubhub.co/cloudflare.com.png</Employerlogo>
      <Employerdescription>Cloudflare is a technology company that helps build a better Internet by protecting and accelerating any Internet application online without adding hardware, installing software, or changing a line of code.</Employerdescription>
      <Employerwebsite>https://www.cloudflare.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/cloudflare/jobs/7549312</Applyto>
      <Location>Distributed</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>36c9df64-9b2</externalid>
      <Title>Senior Software Engineer, App Foundation (Backend)</Title>
      <Description><![CDATA[<p>Join Airbnb&#39;s App Foundation team, a cross-platform team that builds high-quality and performant capabilities that power almost all features in the Guest and Host ecosystem.</p>
<p>As a Senior Software Engineer, you will be responsible for exploring, shaping, and developing new product experiences alongside cross-functional partners (design and product); from ideation to implementation at scale.</p>
<p>You will build efficient and reusable backend capabilities, with high quality, while making sure to maintain performance and scalable systems.</p>
<p>Lead initiatives that measurably improve the Guest and Host experience by improving app responsiveness, scalability, and reliability across key backend paths that impact millions.</p>
<p>Drive a performance roadmap: identifying bottlenecks, prioritizing work by impact, and delivering improvements across services, data access patterns, and infrastructure.</p>
<p>Raise the bar on performance engineering by building tooling, benchmarks, and guardrails that prevent regressions and make performance a first-class part of how teams ship.</p>
<p>Influence architecture and standards across Airbnb’s backend ecosystem, making systems more observable, more efficient, and easier to evolve.</p>
<p>Millions of users across the world engage with the Airbnb app in multiple languages every day. As an engineer on the App Foundation team, you would be critical to the continued success and broad appeal of Airbnb.</p>
<p>In this role, you will have an opportunity to:</p>
<ul>
<li>Work collaboratively in cross-functional teams with design, product, and data science partners to define and ship impactful features.</li>
<li>Propose architectural patterns for high-scale applications, such as well-designed APIs, data pipelines, and efficient algorithms.</li>
<li>Write unit and integration tests and review others&#39; code.</li>
<li>Review service-level performance metrics and triage anomalies or regressions.</li>
<li>Profile and debug performance issues across service boundaries and implement fixes (e.g., query optimization, caching strategies, concurrency improvements, payload reduction).</li>
<li>Partner with engineers across teams to improve critical request flows, aligning on SLOs, rollout plans, and measurement strategies.</li>
<li>Participate in code reviews and architecture discussions with a performance lens, helping teams ship changes safely and efficiently.</li>
<li>Document learnings and create playbooks so performance improvements scale beyond a single service or team.</li>
</ul>
<p>Your Expertise:</p>
<ul>
<li>5+ years of software development experience</li>
<li>Strong expertise in one or more back-end server languages (Java/Kotlin/C++/etc.)</li>
<li>Experience building and scaling high-quality, high-traffic products or systems in a distributed manner</li>
<li>Deep backend expertise, including proficiency with databases, cloud technologies, and asynchronous messaging systems</li>
<li>End-to-end ownership mentality that transcends team boundaries</li>
<li>Passion for building strong collaborative relationships with other engineering &amp; product partners</li>
<li>Desire to tackle projects with large, open-ended scope and drive significant business impact</li>
<li>Ability to self-serve on data analysis and make data-driven decisions</li>
<li>Rigorous attention to detail and the ability to tackle ambiguous problems</li>
<li>Willingness to embrace an ever-changing culture, prioritizing breadth over depth while still going in-depth when needed</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$191,000-$223,000 USD</Salaryrange>
      <Skills>Java, Kotlin, C++, databases, cloud technologies, asynchronous messaging systems, APIs, data pipelines, efficient algorithms, unit testing, integration testing, code review, architecture discussion</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Airbnb</Employername>
      <Employerlogo>https://logos.yubhub.co/airbnb.com.png</Employerlogo>
      <Employerdescription>Airbnb is a global online marketplace for short-term vacation rentals. It has grown to over 5 million hosts who have welcomed over 2 billion guest arrivals.</Employerdescription>
      <Employerwebsite>https://www.airbnb.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/airbnb/jobs/7717198</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6ea8bf6b-ef6</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and complete projects to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494153002</Applyto>
      <Location>Philadelphia, Pennsylvania</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>f08d7a20-ff7</externalid>
      <Title>Strategic Core Account Executive</Title>
      <Description><![CDATA[<p>As a Strategic Enterprise Account Executive at Databricks, you will be responsible for selling into CMEG accounts specific to Gaming/Betting. You will need to understand the product in depth and communicate its value to customers and system integrators. Your goal will be to close deals and exceed activity, pipeline, and revenue targets.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Meeting with CIOs, IT executives, LOB executives, program managers, and other important partners</li>
<li>Closing both new accounts and existing accounts</li>
<li>Identifying and closing quick, small wins while managing longer, complex sales cycles</li>
<li>Exceeding activity, pipeline, and revenue targets</li>
<li>Tracking all customer details including use case, purchase time frames, next steps, and forecasting in Salesforce</li>
<li>Using a solution-based approach to selling and creating value for customers</li>
<li>Promoting Databricks&#39; enterprise cloud data platform powered by Apache Spark™</li>
<li>Ensuring 100% satisfaction among all customers</li>
<li>Prioritising opportunities and applying appropriate resources</li>
<li>Building a plan for success internally at Databricks and externally with your accounts</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>Previous experience in an early-stage company</li>
<li>Field sales experience within big data, Cloud, or SaaS sales</li>
<li>Prior customer relationships with CIOs, program managers, and essential decision makers</li>
<li>Ability to simply articulate intricate cloud technologies</li>
<li>7+ years of experience exceeding sales quotas</li>
<li>Expertise with financial services institutions preferable</li>
<li>Success closing new accounts while working existing accounts</li>
<li>Understanding of Spark and big data preferable</li>
<li>Passion for cloud technologies</li>
<li>Bachelor&#39;s Degree</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Cloud technologies, Big data, Sales, Customer relationship management, Solution-based selling, Apache Spark, Delta Lake, MLflow, Financial services institutions, Gaming/Betting industry</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform. It has over 10,000 organisations worldwide as clients.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8477727002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>62b2a5a2-9bd</externalid>
      <Title>Big Data Solutions Architect (Professional Services)</Title>
      <Description><![CDATA[<p>As a Big Data Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and complete projects to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Working on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Working with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guiding strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consulting on architecture and design; bootstrapping or implementing customer projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Providing an escalated level of support for customer operational issues</li>
<li>Working with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Working with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Strong expertise in data warehousing concepts, architecture, and migration strategies</li>
<li>Comfortable writing code in Python, PySpark, or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Data science expertise is a nice-to-have</li>
<li>Travel to customers 10-20% of the time</li>
<li>Databricks Certification</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, data warehousing, migration strategies, Python, Pyspark, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8482697002</Applyto>
      <Location>Paris, France</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>9d5fcc78-b2b</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and complete projects to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Python, Scala, AWS, Azure, GCP, distributed computing, Spark runtime internals</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8423296002</Applyto>
      <Location>Central - United States; Northeast - United States; Southeast - United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1044456b-79a</externalid>
      <Title>Staff Software Engineer - Backend</Title>
<Description><![CDATA[<p>We are obsessed with enabling data teams to solve the world&#39;s toughest problems. As a software engineer with a backend focus, you will work closely with your team and product management to prioritize, design, implement, test, and operate micro-services for the Databricks platform and product.</p>
<p>Among other things, this involves writing software in Scala/Java, building data pipelines (Apache Spark, Apache Kafka), integrating with third-party applications, and interacting with cloud APIs (AWS, Azure, CloudFormation, Terraform).</p>
<p>You will be part of one of the following teams:</p>
<ul>
<li>Data Science and Machine Learning Infrastructure: Build services and infrastructure at the intersection of machine learning and distributed systems.</li>
<li>Compute Fabric: Build the resource management infrastructure powering all the big data and machine learning workloads on the Databricks platform in a robust, flexible, secure, and cloud-agnostic way.</li>
<li>Data Plane Storage: Deliver reliable and high-performance services and client libraries for storing and accessing humongous amounts of data on cloud storage backends, e.g., AWS S3, Azure Blob Store.</li>
<li>Enterprise Platform: Offer a simple and powerful experience for onboarding and managing customers&#39; data teams across tens of thousands of users on the Databricks platform.</li>
<li>Observability: Provide a world-class platform for Databricks engineers to comprehensively observe and introspect their applications and services.</li>
<li>Service Platform: Build high-quality services and manage the services in all environments in a unified way.</li>
<li>Core Infra: Build the core infrastructure that powers Databricks, making it available across all geographic regions and Cloud providers.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$182,400-$247,000 USD</Salaryrange>
      <Skills>Scala, Java, Apache Spark, Apache Kafka, Cloud APIs (AWS, Azure, CloudFormation, Terraform), SQL, Software security, Cloud technologies (AWS, Azure, GCP, Docker, Kubernetes)</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a global organisation that builds and runs the world&apos;s best data and AI infrastructure platform. It was founded in 2013 by the original creators of Apache Spark.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/6779232002</Applyto>
      <Location>Seattle, Washington</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c43012f4-e47</externalid>
      <Title>RVP, Insurance, Wealth &amp; Asset Management</Title>
      <Description><![CDATA[<p>We are seeking an RVP, Insurance, Wealth &amp; Asset Management to join our Financial Services Team. As a key member of our sales organization, you will be responsible for hiring and leading a team of Strategic sales reps, implementing sales plans, and helping to develop new business and expand existing business.</p>
<p>The ideal candidate has a track record of exceeding revenue goals, leading sales professionals to be their best selves, and is comfortable in a complex, technical sales environment.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Drive revenue success: Own and exceed quarterly/annual sales targets.</li>
<li>Build and implement strategic plans: Develop and execute evolving revenue plans and growth tactics.</li>
<li>Build and manage the team: Hire, manage, and motivate a growing team of sales executives; coach each member via joint selling and regular pipeline reviews.</li>
<li>Build trust-based relationships: Develop long-term relationships with employees, partners, and cross-functional teams.</li>
<li>Distill customer needs and value: Enable your team to understand customer goals and how they relate to the Databricks value proposition.</li>
<li>Manage the front-line voice of Databricks: Lead your team to effectively communicate the value proposition through proposals and presentations.</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>10+ years of successful, progressive experience in enterprise software sales, including Director-level experience at a reputable organization, ideally within Financial Services.</li>
<li>In-depth knowledge of how software is positioned and sold to IT and/or Data executives, ideally within the insurance, wealth &amp; asset management verticals.</li>
<li>Strong track record of exceeding company sales quotas in a complex sales environment.</li>
<li>A background in hiring, leading, and retaining high-performing account executives.</li>
<li>Proven experience with teaching, coaching, and training sales methodologies.</li>
<li>Excellent C-level communication skills.</li>
<li>Proven leadership ability to influence, develop, and empower employees to achieve objectives.</li>
<li>Contract negotiation and deal forecasting experience.</li>
<li>Strong written, verbal, presentation, and organizational skills required, with the ability to articulate and evangelize the value of Databricks solutions.</li>
<li>Passionate about Big Data, AI, Spark, and cloud technologies.</li>
<li>Willingness to travel as needed within the Northeast/New England region.</li>
<li>Bachelor’s Degree required.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise software sales, Financial Services, Big Data, AI, Spark, Cloud technologies, Contract negotiation, Deal forecasting, C-level communication, Leadership, Hiring, Leading, Retaining high-performing account executives, Teaching, Coaching, Training sales methodologies</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8459035002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8131cff5-1a9</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to the customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8341311002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b647b7da-f8f</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect (RSA) on our Professional Services team, you will work with customers on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Handle a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to the customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Collaborate with the Databricks Technical, Project Manager, Architect and Customer teams to ensure the technical components of the engagement are delivered to meet customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>US Top Secret clearance required for this position</li>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Bachelor&#39;s degree in Computer Science, Information Systems, Engineering, or equivalent work experience</li>
<li>Ability to travel up to 30% when needed</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494107002</Applyto>
      <Location>Virginia</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8d1ca2f5-7be</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to the customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461220002</Applyto>
      <Location>Chicago, Illinois</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>21860f67-527</externalid>
      <Title>Staff Software Engineer - Backend</Title>
      <Description><![CDATA[<p>At Databricks, we are obsessed with enabling data teams to solve the world&#39;s toughest problems. We do this by building and running the world&#39;s best data and AI infrastructure platform, so our customers can focus on the high-value challenges that are central to their own missions.</p>
<p>As a software engineer with a backend focus, you will work closely with your team and product management to prioritize, design, implement, test, and operate micro-services for the Databricks platform and product. Among other things, this involves writing software in Scala/Java, building data pipelines (Apache Spark™, Apache Kafka), integrating with third-party applications, and interacting with cloud APIs (AWS, Azure, CloudFormation, Terraform).</p>
<p>Some example teams you can join:</p>
<ul>
<li>Data Science and Machine Learning Infrastructure: Build services and infrastructure at the intersection of machine learning and distributed systems.</li>
<li>Compute Fabric: Build the resource management infrastructure powering all the big data and machine learning workloads on the Databricks platform in a robust, flexible, secure, and cloud-agnostic way.</li>
<li>Data Plane Storage: Deliver reliable and high-performance services and client libraries for storing and accessing humongous amounts of data on cloud storage backends, e.g., AWS S3, Azure Blob Store.</li>
<li>Enterprise Platform: Offer a simple and powerful experience for onboarding and managing customers&#39; data teams across tens of thousands of users on the Databricks platform.</li>
<li>Observability: Provide a world-class platform for Databricks engineers to comprehensively observe and introspect their applications and services.</li>
<li>Service Platform: Build high-quality services and manage the services in all environments in a unified way.</li>
<li>Core Infra: Build the core infrastructure that powers Databricks, making it available across all geographic regions and Cloud providers.</li>
</ul>
<p>Competencies:</p>
<ul>
<li>BS/MS/PhD in Computer Science, or a related field</li>
<li>10+ years of production-level experience in one of: Java, Scala, C++, or a similar language</li>
<li>Comfortable working towards a multi-year vision with incremental deliverables</li>
<li>Experience in architecting, developing, deploying, and operating large-scale distributed systems</li>
<li>Experience working on a SaaS platform or with Service-Oriented Architectures</li>
<li>Good knowledge of SQL</li>
<li>Experience with software security and systems that handle sensitive data</li>
<li>Experience with cloud technologies, e.g. AWS, Azure, GCP, Docker, Kubernetes</li>
</ul>
<p>Pay Range Transparency: The pay range(s) for this role is listed below and represents the expected salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$192,000-$260,000 USD</Salaryrange>
      <Skills>Java, Scala, C++, Apache Spark, Apache Kafka, Cloud APIs, AWS, Azure, CloudFormation, Terraform, SQL, Software security, Cloud technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks enables data teams to solve the world&apos;s toughest problems by building and running the world&apos;s best data and AI infrastructure platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/5408888002</Applyto>
      <Location>San Francisco, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6860353a-782</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to the customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, big data, AI</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461241002</Applyto>
      <Location>Washington, D.C.</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>eb3ba652-daa</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461163002</Applyto>
      <Location>San Francisco, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>9fdd877d-7dc</externalid>
      <Title>Account Executive - West</Title>
      <Description><![CDATA[<p>As an Account Executive, you will play a key role in building our India business, one of our fastest-growing markets in APJ. You will inspire and guide customers on their data journey, making organisations more collaborative and productive than ever before.</p>
<p>Your mission will be to help further build our India business, driving growth through strategic and innovative partnerships with our customers. You will assess your existing customers and develop a strategy to identify and engage all buying centres.</p>
<p>Using a solution approach to selling and creating value for customers, you will identify the most viable use cases in each account to maximise Databricks&#39; impact. You will orchestrate and work with teams to maximise the impact of the Databricks ecosystem on your territory.</p>
<p>You will close new accounts while growing our business in existing accounts. You will promote the Databricks enterprise cloud data platform and be customer-focused by delivering technical and business results using the Databricks Data Intelligence Platform.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Prospecting for new customers</li>
<li>Assessing existing customers and developing a strategy to identify and engage all buying centres</li>
<li>Using a solution approach to selling and creating value for customers</li>
<li>Identifying the most viable use cases in each account to maximise Databricks&#39; impact</li>
<li>Orchestrating and working with teams to maximise the impact of the Databricks ecosystem on your territory</li>
<li>Building value with all engagements to promote successful negotiations and close</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>5+ years of sales experience in SaaS/PaaS or Big Data companies</li>
<li>Prior customer relationships with CIOs and important decision-makers</li>
<li>Ability to articulate intricate cloud technologies and big data in simple terms</li>
<li>3+ years of experience exceeding sales quotas</li>
<li>Success closing new accounts while upselling existing accounts</li>
</ul>
<p>Benefits include:</p>
<ul>
<li>Comprehensive benefits and perks that meet the needs of all employees</li>
<li>A commitment to diversity and inclusion, ensuring equal employment opportunity standards</li>
</ul>
<p>If access to export-controlled technology or source code is required for performance of job duties, it is within Employer&#39;s discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Sales, Cloud Technologies, Big Data, Customer Relationships, Solution Selling</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. It has over 10,000 clients worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/6918763002</Applyto>
      <Location>Mumbai, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c7179545-496</externalid>
      <Title>Resident Solutions Architect (Professional Services)</Title>
      <Description><![CDATA[<p>We&#39;re hiring for multiple roles within our Professional Services team. As a Resident Solutions Architect, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Extensive experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 10% of the time</li>
<li>[Preferred] Databricks Certification, but not essential</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, technical project delivery, documentation, white-boarding, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company behind the Databricks Data Intelligence Platform, used by over 10,000 organisations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8367942002</Applyto>
      <Location>London, United Kingdom</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>5a6826b9-345</externalid>
      <Title>Account Executive, Singapore</Title>
      <Description><![CDATA[<p>We are seeking a creative, driven, and execution-oriented Enterprise Account Executive to sell and maximise the huge market opportunity that exists for Databricks today.</p>
<p>As an Enterprise Account Executive reporting to the Regional Sales Director, you will have experience selling in the Enterprise segment. Your informed point of view on Big Data and Advanced Analytics will guide your successful sales strategy together with our teams and partners, allowing you to provide value to our biggest and most valued customers.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Evangelising Databricks&#39; Unified Analytics Platform powered by Spark and launching the Databricks brand in Enterprise Accounts across all industries</li>
<li>Prospecting, identifying and sourcing new sales opportunities, building pipeline individually and with the Databricks SDR team</li>
<li>Engaging with business and technical decision-makers and leading them through the evaluation and buying process</li>
<li>Exceeding individual activity, pipeline, and annual revenue targets</li>
<li>Engaging with and driving business through local partners (technology partners, ISVs, SIs, and GSIs)</li>
<li>Driving customer success and upselling existing customers</li>
<li>Creating a Territory Strategy across all industries</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>5+ years of experience selling SaaS solutions to Enterprise customers in the ASEAN region</li>
<li>Proven success in Enterprise Sales roles, ideally in big data, Cloud, or SaaS technology</li>
<li>Demonstrable experience in selling innovation, ideally in big data, Cloud, or SaaS technology</li>
<li>Solution and business-outcomes-focused sales approach</li>
<li>Ability to articulate intricate cloud &amp; big data technologies and their business value in simple terms for the customer</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SaaS solutions, Enterprise sales, Big data, Cloud technology, Sales strategy, Customer success, Territory strategy</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified data intelligence platform to over 10,000 organisations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/7726495002</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3827f936-fc2</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>Job Title: Resident Solutions Architect - Financial Services</p>
<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of designing and deploying highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p>About Databricks</p>
<p>Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI.</p>
<p>Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow.</p>
<p>To learn more, follow Databricks on Twitter, LinkedIn and Facebook.</p>
<p>Benefits</p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees.</p>
<p>For specific details on the benefits offered in your region click here.</p>
<p>Our Commitment to Diversity and Inclusion</p>
<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel.</p>
<p>We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards.</p>
<p>Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.</p>
<p>Compliance</p>
<p>If access to export-controlled technology or source code is required for performance of job duties, it is within Employer&#39;s discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, Cloud ecosystems, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461326002</Applyto>
      <Location>New York City, New York</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>9223ca6d-d9e</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Ability to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, Python, Scala, CI/CD, MLOps, distributed computing</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461193002</Applyto>
      <Location>Seattle, Washington</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>142560d9-0be</externalid>
      <Title>Pre-sales Manager, Field Engineering - Named Accounts</Title>
      <Description><![CDATA[<p>As a Field Engineering Manager, you will lead a skilled and motivated team of technical pre-sales Solutions Architects and Solutions Engineers based in Denmark.</p>
<p>You will play a key leadership role at the intersection of technology and business, helping the team design value-focused solutions and support growth through effective collaboration with Sales and other partners.</p>
<p>You will coach your team to communicate complex ideas clearly and accessibly, support strategic sales cycles, and build strong partnerships with customers and stakeholders across large organisations.</p>
<p>The impact you will have:</p>
<ul>
<li>Lead, mentor, and develop an inclusive pre-sales team in Denmark, supporting both performance and long-term career growth.</li>
<li>Foster a collaborative culture that reflects Databricks&#39; values of customer focus, teamwork, and diversity of thought.</li>
<li>Partner with Sales to drive solution- and value-based selling, helping customers understand the business impact of our platform.</li>
<li>Enhance team effectiveness by establishing and iterating on best practices that improve efficiency and focus on meaningful business outcomes.</li>
<li>Build trusted relationships with customer executives and strategic partners, acting as a credible advisor on data, analytics, and AI.</li>
<li>Collaborate across teams, including Marketing, Sales, Product, and Services, to ensure a seamless customer experience and successful implementation.</li>
<li>Represent Databricks as part of the regional leadership team and contribute to building our brand and presence in Nordic markets.</li>
</ul>
<p>What we&#39;re looking for:</p>
<ul>
<li>Experience managing or leading technical teams in Big Data, Cloud, SaaS, or similarly complex technology environments, or equivalent experience that shows you can quickly learn new technologies.</li>
<li>A track record of leading, mentoring, and scaling pre-sales, consulting, or technical field organisations, with an emphasis on coaching and inclusive leadership.</li>
<li>Understanding of consumption-based or subscription business models and how technical decisions influence commercial outcomes.</li>
<li>Ability to collaborate effectively with Sales and other cross-functional leaders, balancing customer value, technical feasibility, and business priorities.</li>
<li>Strong communication and interpersonal skills, with fluency in English required and Danish preferable.</li>
<li>Technical background in Data Engineering, Databases, Data Science, or adjacent fields; consulting or customer-facing experience is an advantage.</li>
<li>Curiosity and passion for data, AI, and cloud technologies, and the ability to articulate their business value in clear, inclusive language.</li>
</ul>
<p>Notes:</p>
<ul>
<li>This will be a &#39;Remote Contract&#39; role, as Databricks currently has no formal office in Denmark; however, the team meets weekly or bi-weekly in a coworking space in Copenhagen.</li>
<li>If you need any adjustments or accommodations during the application or interview process, please let us know so we can support you.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Big Data, Cloud, SaaS, Data Engineering, Databases, Data Science, Leadership, Communication, Interpersonal skills, AI, Cloud technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8506061002</Applyto>
      <Location>Remote - Denmark</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8d65cea1-fd1</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to a customer&#39;s successful understanding, evaluation, and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role are listed below and represent the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461219002</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>507bea17-ad7</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to a customer&#39;s successful understanding, evaluation, and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role are listed below and represent the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, distributed computing, Python, Scala, AWS, Azure, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461251002</Applyto>
      <Location>Mountain View, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>4d67ce9b-156</externalid>
      <Title>Senior Product Manager, Data Enablement</Title>
      <Description><![CDATA[<p>Omada Health is on a mission to inspire and engage people in lifelong health, one step at a time.</p>
<p>As the Senior Product Manager for Data Enablement, you will be the enterprise-level owner of analytics capabilities, shared definitions, and enablement tooling that ensures Omada can answer critical business questions faster and more efficiently. This role will work across Product, Clinical, Commercial, Operations, and Finance to create the structures for how analytics are governed, accessed, and used.</p>
<p>The primary mandate of this role is to own the product strategy and roadmap for Omada’s enterprise analytics, including governance, shared definitions, and enablement tooling. You are accountable for ensuring that teams across Product, Clinical, Commercial, Operations, and Finance can ask and answer critical business questions quickly, consistently, and with high trust in the underlying data. This means defining how key concepts are governed, how analytics tools are shaped and prioritized, and how cross-functional stakeholders engage with the data platform.</p>
<p>You will act as the central product owner for Omada’s analytics ecosystem, setting clear standards and driving adoption across the company. In practice, this includes:</p>
<ul>
<li>Owning the roadmap for core data products and analytics tooling that power dashboards, reporting, and self-service analysis</li>
<li>Driving the culture of data across our engineering, product, and other business teams at Omada</li>
<li>Partnering with data, engineering, analytics, and business leaders to ensure analytics investments are aligned with company strategy and deliver measurable impact</li>
<li>Collaborating with other product managers to incorporate data science concepts into their features &amp; experiences</li>
</ul>
<p>About you:</p>
<ul>
<li>10+ years of relevant product management or data product experience (8+ years with Master&#39;s degree, 5+ years with PhD)</li>
<li>7+ years of experience with data technology and data management, including familiarity with:
<ul>
<li>Modern data warehousing technologies (Redshift, Snowflake, BigQuery, or similar)</li>
<li>Cloud technologies, preferably AWS</li>
<li>Business intelligence and analytics tools (Tableau, Amplitude, Looker, or similar)</li>
<li>Data governance frameworks and data quality management</li>
<li>SQL and the ability to write queries and QA datasets</li>
</ul>
</li>
<li>Subject matter expertise in enterprise analytics governance, data product management, or analytics enablement</li>
<li>Experience establishing governance processes &amp; associated tool implementation for data definitions, metrics, or analytics across multiple business functions</li>
<li>Proven ability to partner with Data Teams on technical evaluation of self-service tools and/or the statistical model training lifecycle</li>
<li>Proven ability to influence senior stakeholders and reconcile opposing viewpoints to drive consensus</li>
<li>Track record of working on problems that are not clearly defined, using advanced knowledge and conceptual thinking to develop solutions</li>
<li>Experience with healthcare data standards and regulatory requirements is strongly preferred</li>
<li>Bachelor&#39;s degree or equivalent relevant experience (advanced degree preferred)</li>
</ul>
<p>Essential Competencies:</p>
<ul>
<li>Strategic &amp; Analytical:
<ul>
<li>Able to think beyond constraints to imagine new ways to use data to impact members and customers</li>
<li>Excellent organizational and analytical skills with strong technical understanding</li>
<li>Comfortable analyzing data and leading research/discovery efforts to understand problem spaces, identify opportunities, and propose solutions</li>
<li>Proven ability to set clear, achievable objectives and work plans for complex, cross-functional initiatives</li>
</ul>
</li>
<li>Communication &amp; Influence:
<ul>
<li>Exceptional written and verbal communication skills</li>
<li>Ability to create formal networks with key decision makers and influence stakeholders across multiple functions</li>
<li>Skilled at adapting communication style for different audiences, from technical teams to senior executives</li>
<li>Experience negotiating matters of significance with senior management and/or major customers</li>
</ul>
</li>
<li>Collaboration &amp; Problem Solving:
<ul>
<li>Comfortable with ambiguity and adept at solving problems with limited resources and information</li>
<li>Able to drive alignment among diverse stakeholders with competing priorities</li>
<li>Great listener who asks insightful questions and doesn&#39;t accept &quot;It&#39;s how we&#39;ve always done it&quot; as an answer</li>
<li>Proven ability to reconcile various and opposing stakeholder views to drive results</li>
</ul>
</li>
<li>Execution &amp; Delivery:
<ul>
<li>History of consistently delivering results against target metrics and commitments</li>
<li>Experience managing product backlogs, writing clear product specifications, and driving execution</li>
<li>Ability to exercise autonomous decision making with limited input on methods and techniques to obtain results</li>
</ul>
</li>
</ul>
<p>Your impact:</p>
<p>In this role, you will directly contribute to transforming how Omada uses data to serve members, customers, and internal stakeholders. Your work will:</p>
<ul>
<li>Eliminate fragmentation: Establish single sources of truth for critical business and clinical metrics, reducing confusion and accelerating decision-making</li>
<li>Increase trust: Build confidence in analytics across the organization through governed definitions and transparent processes</li>
<li>Accelerate insights: Enable analysts and business users to answer questions faster through improved tooling, AI-assisted analytics, and reusable data products</li>
<li>Scale effectively: Create governance frameworks and enablement systems that scale as Omada&#39;s product portfolio and customer base expand</li>
<li>Drive outcomes: Ensure that data and analytics capabilities directly support better health outcomes for members and business results for customers</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Competitive salary with generous annual cash bonus</li>
<li>Equity grants</li>
<li>Remote first work from home culture</li>
<li>Flexible Time Off to help you rest, recharge, and connect with loved ones</li>
<li>Generous parental leave</li>
<li>Health, dental, and vision insurance (and above market employer contributions)</li>
<li>401k retirement savings plan</li>
<li>Lifestyle Spending Account (LSA)</li>
<li>Mental Health Support Solutions</li>
<li>...and more!</li>
</ul>
<p>It takes a village to change healthcare. As we build together toward our mission, we strive to embody the following values in our day-to-day work. We hope these hold meaning for you as well as you consider Omada!</p>
<ul>
<li>Cultivate Trust. We listen closely and we operate with kindness. We provide respectful and candid feedback to each other.</li>
<li>Seek Context. We ask to understand and we build connections. We do our research up front to move faster down the road.</li>
<li>Act Boldly. We innovate daily to solve problems, improve processes, and find new opportunities for our members and customers.</li>
<li>Deliver Results. We reward impact above output.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data technology, data management, modern data warehousing technologies, cloud technologies, business intelligence and analytics tools, data governance frameworks, data quality management, SQL, subject matter expertise in enterprise analytics governance, data product management, analytics enablement</Skills>
      <Category>Engineering</Category>
      <Industry>Healthcare</Industry>
      <Employername>Omada Health</Employername>
      <Employerlogo>https://logos.yubhub.co/omadahealth.com.png</Employerlogo>
      <Employerdescription>Omada Health is a healthcare technology company that provides virtual-first care for pre-diabetes, diabetes, hypertension, and musculoskeletal conditions.</Employerdescription>
      <Employerwebsite>https://www.omadahealth.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/omadahealth/jobs/7759052</Applyto>
      <Location>Remote, USA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>e8554fd8-b88</externalid>
      <Title>Senior Commercial Legal Counsel (MENA)</Title>
      <Description><![CDATA[<p>We are seeking a Senior Commercial Legal Counsel to join our Legal team and focus primarily on Contracts &amp; Commercial topics in the MENA region.</p>
<p>As a member of our legal team, you will be instrumental in supporting our regional MENA business in drafting our contracts, agreements and terms to enable deal making and revenue generation. You will also drive contracts negotiations and help contracts adaptation to the evolving AI regulation and compliance.</p>
<p>The role reports to our Director of Commercial Legal Affairs, based in Paris. The legal team currently comprises 25 people across Paris, London, Singapore, and Palo Alto, covering a spectrum of competencies across regulatory, IP, contracts, and corporate.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Drafting, updating and providing strategic advice on a wide range of contracts, including but not limited to AI-related service agreements.</li>
<li>Negotiating such contracts with suppliers, partners, customers and other stakeholders.</li>
<li>Staying updated on the latest products and on AI-related legal and regulatory developments, including those arising in customers&#39; fields, and guiding their translation into contract adaptation and negotiation.</li>
<li>Providing training and support to the business on legal and compliance matters related to AI, and on contract negotiation.</li>
<li>Helping on other legal topics as needed (labour law, corporate, competition law, etc.).</li>
</ul>
<p>About you:</p>
<ul>
<li>Master&#39;s degree in law; bar admission is a plus, as is experience in a fast-growing environment.</li>
<li>A minimum of 10 years of relevant legal experience, in a related technology or AI-focused environment involving contract negotiations.</li>
<li>Several years of experience working with KSA and the UAE, with numerous deals done in each country, solid knowledge of the regional framework in the relevant areas of law (data privacy, IP, contract law, liability, compliance, export controls), and a mastery of the regional dos and don&#39;ts of doing business in the region.</li>
<li>Successful track record dealing with Public Sector &amp; Semi Government clients</li>
<li>Contract negotiation experience involving complex technologies, both cloud and on-prem, that affect legal terms, and the ability to understand their legal implications</li>
<li>Understanding of artificial intelligence technologies and their applications is essential</li>
<li>Good knowledge of the relevant area of law, such as data privacy, intellectual property, contract law and liability, regulatory and compliance</li>
<li>Ability to analyze complex legal and technical issues related to AI and to identify potential legal risks and solutions is important</li>
<li>Excellent written and verbal communication; Strong ability to present complex legal concepts in a clear and understandable manner</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Contract negotiation, AI-related service agreements, Data privacy, Intellectual property, Contract law, Liability, Regulatory compliance, Export controls, Cloud technologies, On-prem technologies</Skills>
      <Category>Legal</Category>
      <Industry>Technology</Industry>
      <Employername>Mistral AI</Employername>
      <Employerlogo>https://logos.yubhub.co/mistral.ai.png</Employerlogo>
      <Employerdescription>Mistral AI is a technology company that develops and provides AI solutions. It has a diverse workforce with teams distributed across several countries.</Employerdescription>
      <Employerwebsite>https://mistral.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/mistral/de3db0dd-972a-4201-b2ff-a9d41371f53c</Applyto>
      <Location>Abu Dhabi</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>506244a1-b6b</externalid>
      <Title>Staff Software Engineer - Backend</Title>
      <Description><![CDATA[<p>About Cyngn</p>
<p>Cyngn is a publicly-traded autonomous technology company that deploys self-driving industrial vehicles to factories, warehouses, and other facilities throughout North America.</p>
<p>We are a welcoming, diverse team of sharp thinkers and kind humans. Collaboration and trust drive our creative environment. At Cyngn, everyone&#39;s perspective matters, and that&#39;s what powers our innovation.</p>
<p>About this Role</p>
<p>As the Staff Software Engineer for our SaaS platform team, you will be crucial in developing Cyngn&#39;s cutting-edge fleet management system for autonomous industrial vehicles. You&#39;ll collaborate with product and engineering teams to design, implement, deploy, and maintain a robust cloud-based solution that enables real-time control and monitoring of autonomous vehicles in the field.</p>
<p>Responsibilities</p>
<ul>
<li>Architect and lead the development of a sophisticated, cloud-native fleet management system capable of real-time control and monitoring of numerous autonomous vehicles</li>
<li>Design and implement scalable, distributed systems that can handle high-volume, real-time data processing and decision-making</li>
<li>Develop robust APIs and microservices to support integration with various autonomous vehicle platforms and customer systems</li>
<li>Create efficient algorithms for route optimization, task scheduling, and resource allocation across vehicle fleets</li>
<li>Implement advanced data analytics and machine learning capabilities to provide predictive maintenance, performance optimization, and business intelligence features</li>
<li>Ensure system reliability, security, and compliance with industry standards and regulations</li>
<li>Lead a team of skilled engineers, fostering a culture of innovation, code quality, and continuous improvement</li>
<li>Collaborate with product managers, UX designers, and customers to translate business requirements into technical solutions</li>
<li>Mentor junior developers and contribute to the technical growth of the engineering team</li>
<li>Participate in the entire software development lifecycle, from concept and design to testing, deployment, and maintenance</li>
<li>Engage with customers to understand their needs and translate them into technical requirements</li>
</ul>
<p>Who You Are</p>
<ul>
<li>10+ years of software development experience, with a strong focus on backend systems and distributed architectures</li>
<li>Extensive experience in building and scaling cloud-native SaaS platforms, preferably in the IoT or robotics domains</li>
<li>Expert-level proficiency in at least one of Python, Go, Java, or C++, with working knowledge of others</li>
<li>Deep understanding of cloud technologies and services (AWS, Azure, or GCP)</li>
<li>Proven experience with event-driven architectures and message queuing systems (e.g., Kafka, RabbitMQ, Apache Pulsar)</li>
<li>Strong background in database design and optimization, including both SQL and NoSQL solutions</li>
<li>Proficiency in developing scalable WebSocket-based real-time communication systems</li>
<li>Expertise in developing real-time data processing pipelines and analytics systems</li>
<li>Proficiency with containerization and orchestration technologies (Docker, Kubernetes)</li>
<li>Experience with infrastructure-as-code and CI/CD practices (e.g., Terraform, GitOps)</li>
<li>Track record of leading development teams and mentoring junior developers</li>
<li>Excellent problem-solving skills and ability to optimize complex systems</li>
<li>Strong communication skills and ability to explain technical concepts to non-technical stakeholders</li>
<li>Strong collaboration skills with a low ego</li>
</ul>
<p>Nice to Have</p>
<ul>
<li>Experience with IoT platforms, robotics frameworks (e.g., ROS), or autonomous vehicle technologies</li>
<li>Solid understanding of network protocols and communication standards relevant to IoT and autonomous systems</li>
<li>Knowledge of AMQP, MQTT, DDS, or other IoT-specific communication protocols</li>
<li>Experience with time-series databases (e.g., InfluxDB, TimescaleDB) for handling large volumes of sensor data</li>
<li>Familiarity with edge computing and fog architectures</li>
<li>Experience with real-time operating systems and embedded software development</li>
<li>Expertise in building and maintaining scalable API platforms</li>
<li>Experience with geospatial data processing and mapping technologies</li>
<li>Knowledge of fleet management software and vehicle telematics systems</li>
<li>Experience with simulation environments for autonomous systems testing</li>
</ul>
<p>Benefits &amp; Perks</p>
<ul>
<li>Health benefits (Medical, Dental, Vision, HSA and FSA (Health &amp; Dependent Daycare), Employee Assistance Program, 1:1 Health Concierge)</li>
<li>Life, short-term, and long-term disability insurance (Cyngn funds 100% of premiums)</li>
<li>Company 401(k)</li>
<li>Commuter Benefits</li>
<li>Flexible vacation policy</li>
<li>Remote or hybrid work opportunities</li>
<li>Sabbatical leave opportunity after five years with the company</li>
<li>Paid Parental Leave</li>
<li>Daily lunches for in-office employees</li>
<li>Monthly meal and tech allowances for remote employees</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$198,000-225,000 per year</Salaryrange>
      <Skills>backend systems, distributed architectures, cloud-native SaaS platforms, Python, Go, Java, C++, cloud technologies, event-driven architectures, message queuing systems, database design, optimization, WebSocket-based real-time communication systems, containerization, orchestration technologies, infrastructure-as-code, CI/CD practices, IoT platforms, robotics frameworks, autonomous vehicle technologies, network protocols, communication standards, AMQP, MQTT, DDS, time-series databases, edge computing, fog architectures, real-time operating systems, embedded software development, scalable API platforms, geospatial data processing, mapping technologies, fleet management software, vehicle telematics systems, simulation environments</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Cyngn</Employername>
      <Employerlogo>https://logos.yubhub.co/cyngn.com.png</Employerlogo>
      <Employerdescription>Cyngn is a publicly-traded autonomous technology company that deploys self-driving industrial vehicles to factories, warehouses, and other facilities throughout North America.</Employerdescription>
      <Employerwebsite>https://www.cyngn.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/cyngn/a343ef0b-d8de-4195-ba96-251c479ff354</Applyto>
      <Location>Mountain View</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>f4e69181-3d1</externalid>
      <Title>Appian Developer</Title>
      <Description><![CDATA[<p>We are looking for a highly skilled Appian Developer with 6+ years of strong technical experience in Appian development, database technologies, cloud integrations, and DevOps practices. The ideal candidate should be capable of designing scalable solutions, supporting applications at platform and application levels, and working effectively in a global, multi-cultural environment.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Design, develop, and implement Appian applications using Appian SAIL, integrations, and best practices.</li>
<li>Write, optimize, and debug database queries and design scalable database solutions.</li>
<li>Develop and test APIs for seamless system integration.</li>
<li>Implement and maintain CI/CD pipelines following DevOps methodologies.</li>
<li>Integrate Appian applications with GCP cloud services using APIs and other integration approaches.</li>
<li>Work with Oracle SQL, PostgreSQL, MariaDB, and other database technologies to develop robust solutions.</li>
<li>Contribute to cloud adoption initiatives involving GCP or AWS.</li>
<li>Support applications at both platform and application levels.</li>
<li>Design and develop integrations with third-party systems.</li>
<li>Work collaboratively in Agile Scrum teams; utilize tools like JIRA for tracking and delivery.</li>
<li>Provide technical guidance and mentorship to junior developers.</li>
<li>Collaborate with stakeholders across global teams with strong communication, documentation, and presentation skills.</li>
<li>Use common shell commands and scripting for automation or troubleshooting.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>6+ years of strong technical experience with Appian Development.</li>
<li>Hands-on experience with SAIL, SQL, Appian Integrations, Mule APIs, and related tools.</li>
<li>Strong experience with Oracle SQL, PostgreSQL, MariaDB databases.</li>
<li>Knowledge of, or eagerness to self-learn, BPM tools such as Appian.</li>
<li>Experience with API development and testing.</li>
<li>Experience integrating systems with GCP Cloud services.</li>
<li>Knowledge of cloud technologies such as GCP / AWS (services, databases, integration patterns).</li>
<li>Experience across different Java platforms.</li>
<li>Familiarity with DevOps CI/CD pipelines and tools.</li>
<li>Strong understanding of Agile Scrum methodology and tools like JIRA.</li>
<li>Strong analytical, communication, and stakeholder management skills.</li>
<li>Ability to work in a multi-cultural, global team environment.</li>
<li>Ability to work independently, handle pressure, and balance multiple priorities.</li>
</ul>
<p><strong>Benefits</strong></p>
<p>Competitive compensation and benefits package:</p>
<ol>
<li>Competitive salary and performance-based bonuses</li>
<li>Comprehensive benefits package</li>
<li>Career development and training opportunities</li>
<li>Flexible work arrangements (remote and/or office-based)</li>
<li>Dynamic and inclusive work culture within a globally renowned group</li>
<li>Private Health Insurance</li>
<li>Pension Plan</li>
<li>Paid Time Off</li>
<li>Training &amp; Development</li>
</ol>
<p>Note: Benefits differ based on employee level.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Appian Development, SAIL, SQL, Appian Integrations, Mule APIs, Oracle SQL, PostgreSQL, MariaDB, API development, GCP Cloud services, AWS, Java platforms, DevOps CI/CD pipelines, Agile Scrum methodology, JIRA, Cloud adoption initiatives, Cloud technologies, Database technologies, Shell commands and scripting</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The company has a strong 55-year heritage and deep industry expertise.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/6j8Tj5mXCxm5kzNkRkj8b7/hybrid-appian-developer-in-pune-at-capgemini</Applyto>
      <Location>Pune, Maharashtra, India</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7dd68303-658</externalid>
      <Title>Applied AI Engineer</Title>
      <Description><![CDATA[<p><strong>Job Description</strong></p>
<p>Fuse Energy is a forward-thinking renewable energy startup on a mission to deliver a terawatt of renewable energy - fast. We&#39;re combining first-principles thinking with cutting-edge technology to build a radically better energy system.</p>
<p>We&#39;re creating a fully integrated energy company: from developing solar, wind and hydrogen projects to real-time power trading and distributed energy installations. By selling directly to consumers, we cut out the middleman, lower costs and pass on savings to customers.</p>
<p>But we’re not stopping there. We’re also building the Energy Network: a decentralised platform of smart devices that rewards users in Energy Dollars for electrifying their homes, shifting usage to off-peak hours, and helping balance the grid. This network strengthens grid stability - a critical foundation for scaling AI data centers and other energy-intensive industries.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design, develop and deploy AI-powered features that directly impact consumer experiences, including personalised energy recommendations and seamless onboarding via AI models.</li>
<li>Build and optimise internal AI tools that will make the whole company more productive with a focus on automation and enhancing workflows.</li>
<li>Collaborate with backend engineers and data scientists to integrate AI-driven features into our platforms.</li>
<li>Collaborate with the trading and operations teams to ensure AI models are aligned with real-time market conditions and energy pricing.</li>
<li>Improve AI models to optimise trading strategies by anticipating market shifts based on weather and demand forecasts.</li>
<li>Stay up to date with the latest advancements in applied AI and machine learning and apply them to solve real-world problems within the energy space.</li>
<li>Monitor the performance of AI tools and models, ensuring they are functioning efficiently and effectively.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Minimum 3 years of engineering experience</li>
<li>Proven experience as a Backend Engineer with a strong interest and practical experience in applied AI or machine learning</li>
<li>Strong programming skills in Python (or similar languages) with familiarity in AI/ML libraries (TensorFlow, PyTorch, etc.)</li>
<li>Experience working with large-scale models (LLMs/VLMs) and deploying AI-driven solutions into production</li>
<li>Solid understanding of cloud technologies, containerisation and building scalable AI applications</li>
<li>Ability to integrate AI/ML models into real-world applications, focusing on usability and performance</li>
<li>Strong problem-solving skills and a practical approach to implementing AI solutions in a fast-paced environment</li>
<li>Experience working with large datasets, particularly in relation to demand and supply forecasting</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary and an equity sign-on bonus</li>
<li>Biannual bonus scheme</li>
<li>Fully expensed tech to match your needs</li>
<li>Paid annual leave</li>
<li>Breakfast and dinner allowance for office based employees</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, TensorFlow, PyTorch, Cloud technologies, Containerisation, Scalable AI applications, Large-scale models, AI/ML libraries, Energy markets, Trading strategies, Weather forecasting, Energy demand patterns, Production modelling, Natural Language Processing</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Fuse Energy</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Fuse Energy is a renewable energy startup with a mission to deliver a terawatt of renewable energy. It has raised $170M from top-tier investors.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/kGPL8Jr9qCPDq7bxuz8Y2f/hybrid-applied-ai-engineer-in-dubai-at-fuse-energy</Applyto>
      <Location>Dubai</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>4eb8e004-e19</externalid>
      <Title>Java Engineer, Associate</Title>
      <Description><![CDATA[<p>At BlackRock, we&#39;re looking for a Java Engineer to join our team. As a Java Engineer, you&#39;ll be working on our investment operating system Aladdin, which is used by financial institutions worldwide. You&#39;ll be part of the Aladdin Engineering team, responsible for building the next generation of technology that changes the way information, people, and technology intersect for global investment firms.</p>
<p>Responsibilities:</p>
<ul>
<li>Take ownership of individual project priorities, deadlines, and deliverables using Agile methodologies.</li>
<li>Deliver high-efficiency, high-availability, concurrent, and fault-tolerant software systems.</li>
<li>Contribute to the development of Aladdin&#39;s global, multi-asset trading platform.</li>
<li>Design and develop innovative solutions to complex problems, identifying issues and roadblocks.</li>
<li>Demonstrate vision when brainstorming solutions for team productivity, efficiency, guiding, and motivating developers.</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>Master&#39;s Degree or PhD in Computer Science, Engineering, or Mathematics.</li>
<li>Hands-on experience in Java or web development (JavaScript).</li>
<li>Good understanding of concurrent programming and design of high-throughput, high-availability, fault-tolerant distributed applications and databases.</li>
<li>Strong interest in distributed systems, infrastructure services, cloud technology, and AI/ML techniques and technology.</li>
<li>Prior experience in building distributed applications using SQL and/or NoSQL technologies such as MSSQL, MongoDB, Snowflake, or Redis is a plus.</li>
<li>Prior experience with message broker technology such as Kafka or gRPC is a plus.</li>
<li>Prior experience with modern front-end frameworks such as React, Vue.js, or Angular is a plus; Angular preferred.</li>
<li>Excellent analytical and software architecture design skills, with an emphasis on test-driven development.</li>
<li>Effective communication and presentation skills, both written and verbal</li>
</ul>
<p>Our benefits include retirement investment and tools designed to help you build a sound financial future, access to education reimbursement, comprehensive resources to support your physical health and emotional well-being, family support programs, and Flexible Time Off (FTO) so you can relax, recharge, and be there for the people you care about.</p>
<p>Our hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Java, web development, concurrent programming, distributed systems, infrastructure services, cloud technology, AI/ML techniques and technology, SQL, NoSQL technologies, message broker technology, modern front-end frameworks, Angular, React, Vue.js</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>BlackRock</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>BlackRock is a global investment management company with over $12 trillion in assets under management.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/u3TPbEc65jhur5x3zbCSL7/java-engineer%2C-associate-in-edinburgh-at-blackrock</Applyto>
      <Location>Edinburgh, Scotland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>6efafbcb-51b</externalid>
      <Title>Support Specialist</Title>
      <Description><![CDATA[<p>As a Support Specialist in our Auckland team, your mission is to become an expert in Vista Group software, playing a key role in customer service and technical support. This is a fast-paced role where you will prioritise multiple tasks, troubleshoot effectively, and provide excellent communication to internal and external stakeholders.</p>
<p>We&#39;d expect you to be a self-starter who rarely needs guidance or instruction. We&#39;d like to see you beginning to practise basic leadership capabilities and proactively taking steps to prevent software issues. You will consistently achieve business goals and metrics promptly and accurately.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Investigate, track, find solutions, and resolve support issues and questions</li>
<li>Test solutions before deploying to production, understand program code and error logs, and log software bugs, in collaboration with clients, team members, vendors, and third parties around the world in various roles</li>
<li>Install and configure Vista Group software remotely and onsite, as part of troubleshooting, resolving issues or onboarding, as needed</li>
<li>Perform cross-regional and cross-departmental tasks and initiatives as assigned either on an ad-hoc, temporary basis, or as part of a more formal rotation program</li>
<li>Become knowledgeable of Vista Group software and the cinema industry to effectively serve clients</li>
<li>Upskill and train clients, team members, vendors, and third parties via various methods such as creating documentation and hosting training sessions</li>
<li>Build strong relationships with clients and team members</li>
<li>Be available on rotation with a cell phone for urgent after-hours support issues</li>
<li>Travel domestically and internationally on occasion</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Specific knowledge of SQL Server, Visual Basic, .NET, cloud technologies, and networking</li>
<li>Excellent problem-solving and analytical skills, verbal and written communication, and organisational skills with meticulous attention to detail</li>
<li>Proven ability to remain composed under pressure, use independent discretion and judgment to assess situations, and respond appropriately</li>
<li>A commitment to excellent customer service and dedication to creating great outcomes and working relationships</li>
<li>Proficient in Microsoft Office (Outlook, Word, Excel, PowerPoint, etc.)</li>
<li>Current valid passport or able to obtain one promptly if required</li>
<li>An undergraduate degree or certifications in a related technical area is preferred</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Excellent work/life balance including a 4 ½ day working week</li>
<li>Hybrid working (home and office based split, requiring regular weekly attendance in the Auckland office)</li>
<li>Medical and Life insurance</li>
<li>Volunteer day, enhanced paid parental leave and wellness benefits</li>
<li>Strong mentoring &amp; career development focus</li>
<li>Fun team events including the Vista Innovation Cup</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL Server, Visual Basic, .NET, cloud technologies, networking, Microsoft Office</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Vista Group</Employername>
      <Employerlogo>https://logos.yubhub.co/j.com.png</Employerlogo>
      <Employerdescription>Vista Group is a world-leading company that makes software for the cinema industry, serving cinemas, film distributors, and moviegoers worldwide.</Employerdescription>
      <Employerwebsite>https://apply.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/CC38DC05FF</Applyto>
      <Location>Auckland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>4d7a527c-bc2</externalid>
      <Title>Senior Developer Experience Engineer</Title>
      <Description><![CDATA[<p><strong>Job Title: Senior Developer Experience Engineer</strong></p>
<p><strong>Company: Zoopla</strong></p>
<p><strong>Job Summary:</strong></p>
<p>As a Senior Developer Experience Engineer at Zoopla, you will work on the Developer Experience side of the engineering organisation. You will be responsible for building and maintaining the tools and infrastructure that enable developers to work efficiently and effectively.</p>
<p><strong>Job Description:</strong></p>
<p>Zoopla is one of the UK&#39;s most instantly recognisable property brands. We&#39;re known and loved by over 91% of the nation (and we&#39;re working hard on the other 9%). Our mission is to help the nation make better home decisions - by connecting everyone to their home and giving them personalised insights to help with moving, managing or financing.</p>
<p>We&#39;re a growing team that embraces innovation and isn&#39;t afraid to push the boundaries. We&#39;re only just starting our journey to redefine the digital property landscape, with much more to explore and achieve.</p>
<p><strong>Requirements:</strong></p>
<ul>
<li>You&#39;re skilled in Next.js and TypeScript.</li>
<li>You like working with AI and you have built things to leverage its usage (think CLIs, custom agents, workflows, anything related to automating and simplifying processes).</li>
<li>You&#39;re working on your engineering craft, and you practise writing maintainable code, reviewing code, pair programming at times, and writing tests.</li>
<li>You&#39;re commercially driven, resilient, and able to respond to changes without missing a beat.</li>
<li>You&#39;re passionate about building products to delight users, while also paying attention to building products that are stable, scalable, secure, observable, and performant.</li>
<li>You&#39;ll have worked with some type of cloud tech, though we&#39;ll make sure you&#39;re supported by our SRE folks when you need to flex across the whole stack.</li>
</ul>
<p><strong>Benefits:</strong></p>
<ul>
<li>25 days annual leave + extra days for years of service</li>
<li>Day off for volunteering &amp; Digital detox day</li>
<li>Festive Closure - business closed for the period between Christmas and New Year</li>
<li>Cycle to work and electric car schemes</li>
<li>Free Calm App membership</li>
<li>Enhanced Parental leave</li>
<li>Fertility Treatment Financial Support</li>
<li>Group Income Protection and private medical insurance</li>
<li>Gym on-site in London</li>
<li>7.5% pension contribution by the company</li>
<li>Discretionary annual bonus up to 10% of base salary</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Next.js, TypeScript, AI, CLIs, custom agents, workflows, cloud tech, Golang, Python, AWS, GitHub/GitLab, serverless technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Zoopla</Employername>
      <Employerlogo>https://logos.yubhub.co/j.com.png</Employerlogo>
      <Employerdescription>Zoopla is a UK-based property brand that provides data and information on every UK property, with over 50 million monthly visitors.</Employerdescription>
      <Employerwebsite>https://apply.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/61EA325FF4</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>828350e2-410</externalid>
      <Title>Software Engineer - Platform Services</Title>
      <Description><![CDATA[<p>You will be part of the Platform Development team, reporting to a senior manager, and your role will be to participate in the planning and development of our Infrastructure platforms. This includes conducting assessments and gathering requirements, designing system improvements, and integrating new technologies.</p>
<p><strong>Responsibilities</strong></p>
<p>You will play an integral role in the team responsible for the major build-out of functionality on the infrastructure platforms, services and associated backend infrastructure. You will collaborate with cross-functional teams to define, design, and deliver new features. You will manage specific aspects of the buildout, which includes planning and communicating with senior stakeholders.</p>
<p>You will prepare documentation, provide guidance, conduct code reviews, and offer mentorship to elevate the skills of the team and the quality of our projects. You will continuously explore and implement improvements to enhance efficiency and performance. You will troubleshoot and resolve issues across various environments and platforms.</p>
<p><strong>Qualifications</strong></p>
<ul>
<li>A comprehensive understanding of software architecture and design patterns.</li>
<li>A minimum of 5 years of expertise in Python, including at least one popular Python framework.</li>
<li>A minimum of 5 years of practical experience with web frameworks for building RESTful APIs in Python, such as Django or FastAPI.</li>
<li>Strong problem-solving skills with a proactive approach to addressing challenges.</li>
<li>A minimum of 5 years of practical experience in DevOps and infrastructure roles.</li>
<li>Experience with CI/CD pipeline design and automation tools.</li>
<li>Familiarity with front-end technologies such as JavaScript, HTML5, and CSS.</li>
<li>Experience with SQL/NoSQL databases.</li>
<li>Experience with cloud technologies such as Azure or AWS.</li>
<li>Experience with containerization and orchestration technologies, such as Docker or Kubernetes.</li>
<li>Familiarity with Linux systems, equivalent to LPIC-2.</li>
<li>Knowledge of security best practices in software development.</li>
<li>Familiarity with general IT infrastructure topics: virtualization, storage, networking, etc.</li>
</ul>
<p>This is a hybrid role, with 3 days per week working from our office in Bucharest.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, Django, FastAPI, DevOps, Infrastructure, CI/CD pipeline design, Automation tools, Front-end technologies, JavaScript, HTML5, CSS, SQL/no-SQL databases, Cloud Technologies, Azure, AWS, Containerization, Orchestration, Docker, Kubernetes, Linux systems, LPIC-2, Security best practices</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts is a digital entertainment company that creates next-level entertainment experiences. It has a diverse portfolio of games and experiences.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Software-Engineer-Platform-Services/211319</Applyto>
      <Location>Bucharest</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>8a7a8279-e87</externalid>
      <Title>Software Engineer - Frostbite</Title>
      <Description><![CDATA[<p>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. As a Software Engineer at Frostbite, you will join our Core Services team and report to the Development Manager with the mission to improve the productivity and efficiency of the wider Frostbite and game development teams.</p>
<p>Responsibilities:</p>
<ul>
<li>Work with a team of software engineers, quality designers, and data analysts to develop new workflows and frameworks that take Frostbite into a new era of AI-based workflows.</li>
<li>Identify new opportunities to improve Frostbite quality processes, and lead the design, implementation, review, and delivery of major functionality.</li>
<li>Ensure that code is tested and of high quality, and that our outcomes are measurable and documented.</li>
<li>Debug technical issues and prevent recurrences.</li>
<li>Contribute to our team culture of learning, collaboration, sustainability, and promoting engineering best practices.</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>BSc degree in Computer Science, Software Engineering, a related academic programme, or equivalent training and experience.</li>
<li>5+ years of professional experience in C# or C++.</li>
<li>Experience with the software engineering lifecycle of a complex codebase, including testing, CI/CD and software release.</li>
</ul>
<p>Bonus if you have...</p>
<ul>
<li>Experience in AI and Agentic technologies to aid software development.</li>
<li>Background in Quality Engineering.</li>
<li>Knowledge of client-server application development, and deploying code in AWS, Azure, or Google Cloud environments.</li>
<li>Experience in database and cloud technologies.</li>
<li>Experience in game or game engine development.</li>
</ul>
<p>This is a hybrid role in EA Vancouver.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$119,600 - $167,300 CAD</Salaryrange>
      <Skills>C#, C++, Software engineering lifecycle, Testing, CI/CD, Software release, AI and Agentic technologies, Quality Engineering, Client-server application development, Database and cloud technologies, Game or game engine development</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts is a leading video game developer and publisher with a portfolio of games and experiences. The company has locations around the world and employs thousands of people.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Software-Engineer/212855</Applyto>
      <Location>Vancouver</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>34127f04-9f9</externalid>
      <Title>Senior Technical Product Manager</Title>
      <Description><![CDATA[<p><strong>Senior Technical Product Manager</strong></p>
<p><strong>About the Role</strong></p>
<p>This exciting role offers a talented and experienced technical product manager the opportunity to drive innovations that improve the ability of EA&#39;s developers to make great games and products.</p>
<p>You will have the chance to work with game teams across the entire organisation, including FC, Madden, Battlefield, the Sims, Apex Legends, and central teams such as Frostbite.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Vision &amp; Roadmap: Own the product vision and roadmaps for EA&#39;s internally used products across adjacent developer platform capabilities, aligned to studio and platform strategy.</li>
<li>Customer Success: Understand internal customer pain points (EA game developers) and translate them into prioritised, outcome-driven requirements informed by research and competitive analysis.</li>
<li>Aligning Priorities: Balance scalability, security, usability, and developer productivity, ensuring roadmap decisions are grounded in measurable business and developer impact.</li>
<li>Domain Expertise: Partner deeply with engineering leaders to understand system architecture, performance trade-offs, and scalable platform services; translate complex technical initiatives into clear execution plans.</li>
<li>EA&#39;s Software Development Lifecycle: Develop a strong understanding of the game development lifecycle and identify SDLC friction points to enhance developer experience.</li>
<li>Stakeholder Alignment: Collaborate closely with developers, design, and studio stakeholders to validate solutions, align priorities, and drive adoption through clear communication and enablement.</li>
<li>Measure Success: Define and track success metrics tied to adoption, efficiency, reliability, and developer satisfaction; ensure instrumentation and telemetry are embedded from the outset.</li>
<li>AI-Forward: Leverage AI tools operationally to improve product management efficiency and identify opportunities to reduce manual workflows and enhance developer enablement.</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>5+ years of product management experience, preferably in platform, infrastructure, or developer-facing products.</li>
<li>Experience defining, owning and driving product initiatives, ideally in technically oriented or platform-focused areas.</li>
<li>Strong technical foundation, including familiarity with APIs, microservices, cloud technologies (e.g., AWS, Azure, GCP), CI/CD pipelines, and general software development lifecycle, with the ability to confidently engage in technical discussion and understand system architecture.</li>
<li>Prior experience using AI tools to support day-to-day work.</li>
<li>Prior experience partnering closely with engineering teams to deliver reliable, scalable solutions.</li>
<li>Data-informed approach to decision-making, including defining success metrics and using insights to guide priorities.</li>
<li>Strong written and verbal communication skills.</li>
<li>Thrives in working with both technical and non-technical partners.</li>
<li>Experience driving results across product, engineering, and studio teams in multi-org or multi-team environments.</li>
</ul>
<p><strong>Benefits</strong></p>
<p>This is a hybrid role, with 3 days per week working from our office in Madrid.</p>
<p>Please apply with a CV in English.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>APIs, microservices, cloud technologies, CI/CD pipelines, software development lifecycle, AI tools, data-informed decision-making, AWS, Azure, GCP, product management, technical product management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts is a leading video game developer and publisher with a portfolio of popular games and experiences.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Senior-Technical-Product-Manager/212675</Applyto>
      <Location>Madrid</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>d55c291a-1f1</externalid>
      <Title>Subtitle Translator, QCer, SDH Linguist, Template Linguist</Title>
      <Description><![CDATA[<p><strong>Join the Keywords Studios Talent Community – Subtitle Translation/Localization Experts</strong></p>
<p>We&#39;re a global network of subtitle and localization specialists, partnering with iconic developers, publishers, and content creators. If you&#39;re a talented professional with expertise in subtitling, we&#39;d love to connect with you.</p>
<p><strong>We&#39;re Looking for Freelance Subtitle Translation Opportunities</strong></p>
<p>We&#39;re always on the lookout for talented professionals to join our network of subtitle and localization specialists. If you have experience in subtitling, QC, SDH, or template linguistics, we&#39;d love to hear from you.</p>
<p><strong>Requirements</strong></p>
<ul>
<li>Native proficiency of the target language</li>
<li>Strong command of the source language</li>
<li>Experience in at least one of the following areas of the entertainment industry: localization QC, audiovisual translation and subtitling</li>
<li>Experience with subtitle editing software and web/cloud technology</li>
<li>Deep understanding of closed captioning and subtitling, and their common failures and technical challenges</li>
<li>Solid understanding of nuances of subtitle and dub translations</li>
<li>Working knowledge of cultural differences and best practices for subtitles and dub audio creation</li>
<li>University degree or equivalent professional experience in the translation field</li>
<li>Ability to quickly adapt to workflow/process changes and updates</li>
<li>Great attention to detail, organization, problem-solving, analytical and multitasking skills</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive pay</li>
<li>Work on popular titles across film, TV, streaming, games and much more</li>
<li>Early access to unreleased content</li>
<li>Flexible project volume</li>
<li>100% remote work</li>
<li>Set your own schedule</li>
<li>Constructive feedback and support</li>
</ul>
<p><strong>Please Note</strong></p>
<ul>
<li>Signing a Non-Disclosure Agreement (NDA) is required prior to starting the recruitment process</li>
<li>Due to the high volume of applications, we regret that we are only able to respond to candidates who meet the above requirements</li>
</ul>
]]></Description>
      <Jobtype>freelance</Jobtype>
      <Experiencelevel>entry|mid|senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Subtitle editing software, Web/cloud technology, Closed captioning and subtitling, Subtitle and dub translations, Cultural differences and best practices, Localization QC, Audiovisual translation and subtitling, Template linguistics</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Keywords Studios</Employername>
      <Employerlogo>https://logos.yubhub.co/j.com.png</Employerlogo>
      <Employerdescription>Keywords Studios is the world&apos;s leading provider of technical and creative services for the video games and entertainment industries, with a global footprint of over 70 studios across 26 countries.</Employerdescription>
      <Employerwebsite>https://apply.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/A08655B082</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>8a0f394e-d0f</externalid>
      <Title>Expert Subtitle Translator/QCer: English to Basque</Title>
      <Description><![CDATA[<p>We are seeking a freelance translator with proven experience in subtitling for the TV and movie industry to join our global localization network. As an Expert Subtitle Translator/QCer, you will work on adapting blockbuster game franchises and ensuring accessibility for film and TV audiences.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Translate subtitles from English to Basque with high accuracy and attention to detail</li>
<li>Collaborate with our localization teams to deliver high-impact experiences through expert translation, subtitling, dubbing, voice-over, and linguistic quality assurance services</li>
<li>Work on popular titles across film, TV, streaming, games, and much more</li>
<li>Adapt to workflow/process changes and updates quickly</li>
<li>Provide constructive feedback and support to our teams</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Native language fluency of Basque</li>
<li>Extensive knowledge of English</li>
<li>University degree or equivalent professional experience in the translation field</li>
<li>Experience in at least one of the following areas of the entertainment industry: localization QC, audiovisual translation, and subtitling</li>
<li>Deep understanding of nuances of subtitle and dub translations</li>
<li>Working knowledge of cultural differences and best practices for subtitles and dub audio creation</li>
<li>Understanding of closed captioning and subtitling, and their common failures and technical challenges</li>
<li>Confidence and experience with subtitle editing software and web/cloud technology</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive pay</li>
<li>Early access to unreleased content</li>
<li>Flexible project volume</li>
<li>100% remote work</li>
<li>Set your own schedule</li>
<li>Constructive feedback and support</li>
</ul>
<p><strong>Our Diversity, Equity, Inclusion, and Belonging (DEIB) Commitment</strong></p>
<p>Keywords Studios is an Equal Opportunity Employer and considers applicants for all positions without regard to race, ethnicity, religion or belief, sex, age, national origin, marital status, sexual orientation, gender identity, disability, or any other characteristic protected by applicable laws.</p>
]]></Description>
      <Jobtype>contract</Jobtype>
      <Experiencelevel></Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
<Skills>Basque, English, Subtitle editing software, Web/cloud technology, Localization QC, Audiovisual translation, Subtitling</Skills>
      <Category>Media &amp; Entertainment</Category>
      <Industry>Entertainment</Industry>
      <Employername>Keywords Studios</Employername>
      <Employerlogo>https://logos.yubhub.co/j.com.png</Employerlogo>
      <Employerdescription>Keywords Studios is a global leader in technical and creative services for the video games and entertainment industries, with over 70 studios across 26 countries.</Employerdescription>
      <Employerwebsite>https://apply.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/70B3DA7C08</Applyto>
      <Location>Basque Country, Spain</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>5de0016f-87f</externalid>
      <Title>Expert Subtitle Translator/QCer English to Tagalog | Filipino</Title>
      <Description><![CDATA[<p>We are looking for expert subtitling freelance translators with proven experience in subtitling for the TV and movie industry from English into Filipino/Tagalog. If you would like to pursue freelance translation opportunities with us, please send us your CV in English, providing detailed information regarding your subtitling experience.</p>
<p>We are currently seeking a freelance translator with the following qualifications:</p>
<p><strong>Requirements</strong></p>
<ul>
<li>Native language fluency of target language</li>
<li>Extensive knowledge of source language</li>
<li>University degree or equivalent professional experience in the translation field</li>
<li>Experience in at least one of the following areas of the entertainment industry: localization QC, audiovisual translation, and subtitling</li>
<li>Deep understanding of nuances of subtitle and dub translations</li>
<li>Working knowledge of cultural differences and best practices for subtitles and dub audio creation</li>
<li>Understanding of closed captioning and subtitling, and their common failures and technical challenges</li>
<li>Confidence and experience with subtitle editing software and web/cloud technology</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive pay</li>
<li>Work on popular titles across film, TV, streaming, games, and much more</li>
<li>Early access to unreleased content</li>
<li>Flexible project volume</li>
<li>100% remote work</li>
<li>Set your own schedule</li>
<li>Constructive feedback and support</li>
</ul>
]]></Description>
      <Jobtype>contract</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
<Skills>subtitle editing software, web/cloud technology, audiovisual translation, subtitling, closed captioning</Skills>
      <Category>Media &amp; Entertainment</Category>
      <Industry>Media &amp; Entertainment</Industry>
      <Employername>Keywords Studios Los Angeles</Employername>
      <Employerlogo>https://logos.yubhub.co/j.com.png</Employerlogo>
      <Employerdescription>Keywords Studios Los Angeles is a provider of multimedia content localization and audio production services, collaborating with top-tier streaming platforms, broadcasters, content creators, and publishers in the gaming and media &amp; entertainment industries.</Employerdescription>
      <Employerwebsite>https://apply.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/1B59FE5260</Applyto>
      <Location>Metro Manila, Philippines</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>d5ac48b1-503</externalid>
      <Title>Software Engineer, Compute Platform</Title>
      <Description><![CDATA[<p><strong>Location</strong></p>
<p>Foster City, CA (Hybrid; in office Monday, Wednesday, and Friday)</p>
<p><strong>Employment Type</strong></p>
<p>Full time</p>
<p><strong>Location Type</strong></p>
<p>Hybrid</p>
<p><strong>Department</strong></p>
<p>Engineering, Platform</p>
<p><strong>Compensation</strong></p>
<ul>
<li>Compensation is determined based on career level, with the base salary for this role ranging from $130K – $290K • Offers Equity • Performance Based Bonus</li>
</ul>
<p>We are seeking talented distributed systems engineers who are passionate about building innovative solutions for application deployment. Your mission will be to enhance the capabilities of Replit Infrastructure, optimize performance across global regions, and drive efficiency while delivering an exceptional user experience. If you have a strong foundation in software development, a deep understanding of cloud technologies, and a track record of delivering high-quality code, we want to hear from you.</p>
<p><strong>In this role you will:</strong></p>
<ul>
<li>Expand Replit&#39;s cloud infrastructure offerings: Launch new cloud products to be used by Replit Agent to build complex apps. Collaborate with cross-functional teams to design and implement these features, empowering developers with a comprehensive suite of tools to build and deploy their applications efficiently.</li>
<li>Enhance reliability and scalability: Identify bottlenecks, optimize critical paths, and implement robust monitoring and alerting systems. Work closely with the SRE team to ensure high availability and minimal downtime. Enable our customers to seamlessly scale their applications to meet the demands of their growing user base.</li>
<li>Improve utilization of cloud infrastructure: Analyze our infrastructure costs and identify opportunities for optimization. Implement strategies to reduce cloud expenses without compromising performance or reliability. This could involve techniques such as resource provisioning, auto-scaling, cost-aware scheduling, and data lifecycle management. Your efforts will directly contribute to the financial efficiency of our cloud services.</li>
</ul>
<p><strong>Required skills and experience:</strong></p>
<ul>
<li>Distributed systems: Track record of working with platform-as-a-service, distributed storage, or information retrieval systems. Experience in designing scalable architectures and optimizing systems for latency or cost.</li>
<li>Problem-solving mindset: Ability to approach complex challenges pragmatically and devise effective solutions. You think radically but ship incrementally.</li>
<li>Self-directed and autonomous: Able to work independently, set priorities, and drive projects forward. You take ownership and initiative.</li>
<li>Versatility and flexibility: Able to wear multiple hats and tackle a wide range of challenges. You are comfortable working across different layers of the stack and adapting to the needs of the project.</li>
<li>Continuous learning and adaptability: Passionate about staying up-to-date with industry trends and expanding your skill set. You embrace change and adapt quickly.</li>
</ul>
<p><strong>Nice to have:</strong></p>
<ul>
<li>Experience working on cloud infrastructure or platform products, particularly in the areas of application deployment, serverless computing, or container orchestration.</li>
<li>Familiarity with Google Cloud Platform (GCP) services and tools, such as GCE, GKE, Cloud Run, or Cloud Storage.</li>
<li>Contributions to open-source projects related to cloud technologies, deployment frameworks, or developer tools. We love OSS!</li>
</ul>
<p><strong>Tools + Tech Stack for this role</strong></p>
<ul>
<li>Golang, Rust</li>
</ul>
<p><strong>This role may not be a fit if</strong></p>
<ul>
<li>You are a generalist backend engineer who hasn’t built scalable distributed systems.</li>
<li>You cannot take part in the on-call rotation of at least 6 people.</li>
<li>You do not enjoy diving into Linux internals.</li>
</ul>
<p><em>This is a full-time role that can be held from our Foster City, CA office. The hybrid role has an in-office requirement of Monday, Wednesday, and Friday.</em></p>
<p><strong>Full-Time Employee Benefits Include:</strong></p>
<p>💰 Competitive Salary &amp; Equity</p>
<p>💹 401(k) Program with a 4% match</p>
<p>⚕️ Health, Dental, Vision and Life Insurance</p>
<p>🩼 Short Term and Long Term Disability</p>
<p>🚼 Paid Parental, Medical, Caregiver Leave</p>
<p>🚗 Commuter Benefits</p>
<p>📱 Monthly Wellness Stipend</p>
<p>🧑‍💻 Autonomous Work Environment</p>
<p>🖥 In Office Set-Up Reimbursement</p>
<p>🏝 Flexible Time Off (FTO) + Holidays</p>
<p>🚀 Quarterly Team Gatherings</p>
<p>☕ In Office Amenities</p>
<p><strong>Want to learn more about what we are up to?</strong></p>
<ul>
<li>Meet the Replit Agent</li>
<li>Replit: Make an app for that</li>
<li>Replit Blog</li>
<li>Amjad TED Talk</li>
</ul>
<p><strong>Interviewing + Culture at Replit</strong></p>
<ul>
<li>Operating Principles</li>
<li>Reasons not to work at Replit</li>
</ul>
<p>To achieve our mission of making programming more accessible around the world, we need our team to be representative of the world. We welcome your unique perspective and experiences in shaping this product. We encourage people from all kinds of backgrounds to apply, including and especially candidates from underrepresented and non-traditional backgrounds.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$130K – $290K</Salaryrange>
      <Skills>Distributed systems, Problem-solving mindset, Self-directed and autonomous, Versatility and flexibility, Continuous learning and adaptability, Experience working on cloud infrastructure or platform products, Familiarity with Google Cloud Platform (GCP) services and tools, Contributions to open-source projects related to cloud technologies, deployment frameworks, or developer tools</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Replit</Employername>
      <Employerlogo>https://logos.yubhub.co/replit.com.png</Employerlogo>
      <Employerdescription>Replit is a software creation platform that enables anyone to build applications using natural language. With millions of users worldwide, Replit is a leading provider of software development tools.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/replit/659a8e1e-69ba-44c0-a632-96665051a3e8</Applyto>
      <Location>Foster City, CA</Location>
      <Country></Country>
      <Postedate>2026-03-07</Postedate>
    </job>
    <job>
      <externalid>939b991c-cd9</externalid>
      <Title>Strategic Finance, Compute Lead</Title>
      <Description><![CDATA[<p><strong>Location</strong></p>
<p>San Francisco</p>
<p><strong>Employment Type</strong></p>
<p>Full time</p>
<p><strong>Department</strong></p>
<p>Strategic Finance</p>
<p><strong>Compensation</strong></p>
<ul>
<li>$185K – $260K • Offers Equity</li>
</ul>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p>More details about our benefits are available to candidates during the hiring process.</p>
<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components at any time based on individual performance, team or company results, or market conditions.</p>
<p><strong>About the Team</strong></p>
<p>The Compute &amp; Infrastructure Strategy team handles strategy and execution of OpenAI’s compute roadmap. This team’s key responsibilities span financial analysis &amp; reporting, capacity planning, commercial and business development, and strategic partnerships. We partner across the business to allocate and deploy our resources for the highest impact outcomes.</p>
<p><strong>About the Role</strong></p>
<p>Compute is a key lever for OpenAI and AI progress. We are seeking a Strategic Finance Compute Lead to provide finance leadership for our compute and infrastructure spend and play a significant role in shaping our long-term compute strategy. You will play a critical role in developing financial models for all areas of compute, analyzing spend patterns, and providing critical insights to optimize and plan for our future compute needs. This role will be a key partner to our scaling and supercomputing engineering teams, providing financial expertise and guidance to optimize our capacity investments and drive strategic decision-making, while collaborating with other members of the finance organization to align our compute strategy with broader financial considerations.</p>
<p>This role is based in San Francisco, CA. We use a hybrid work model of 3 days in the office per week and offer relocation assistance to new employees.</p>
<p><strong>In this role, you will:</strong></p>
<ul>
<li>Own and develop financial models across different elements of compute (GPUs, CPUs, storage and networking)</li>
<li>Lead strategic financial analysis for long-term capacity initiatives, working closely with scaling and supercomputing engineering teams</li>
<li>Maintain deep expertise on compute contract terms, pricing structures and optimization opportunities</li>
<li>Serve as a partner to FP&amp;A and strategic finance teams, aligning compute and infrastructure with broader financial and business strategies</li>
<li>Create high-quality Exec and Board-facing presentations</li>
<li>Stay abreast of market trends and competitive dynamics to inform and improve our infrastructure strategy</li>
</ul>
<p><strong>You might thrive in this role if you have:</strong></p>
<ul>
<li>5+ years of experience across strategic finance, private / growth equity, investment banking, strategy &amp; operations, and/or business development with 3+ years of finance operating experience at a high-growth technology company</li>
<li>Experience partnering with engineering and product teams to provide financial analysis and insights to critical strategic decisions</li>
<li>Good understanding of cloud technology and compute infrastructure</li>
<li>Exceptionally strong analytical, financial modeling, and written and oral communication skills</li>
<li>Demonstrated track record of thoughtful investment decisions</li>
<li>Experience driving operational outcomes under ambitious deadlines</li>
<li>Exceptionally strong relationship building, business judgment, and communication skills</li>
<li>Bachelor’s degree or equivalent practical experience</li>
</ul>
<p><strong>About OpenAI</strong></p>
<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full spectrum of humanity.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$185K – $260K</Salaryrange>
      <Skills>strategic finance, private / growth equity, investment banking, strategy &amp; operations, business development, financial modeling, cloud technology, compute infrastructure, relationship building, business judgment, communication skills, data analysis, data visualization, financial planning, budgeting, forecasting, financial reporting, accounting, auditing, taxation, financial regulations</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. The company was founded in 2015 and has since grown to become a leading player in the field of artificial intelligence.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/b4196174-9cc3-487d-9d21-848bc283b80f</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>c78037b4-921</externalid>
      <Title>Senior Software Engineer - Backend</Title>
      <Description><![CDATA[<p><strong>Summary</strong></p>
<p>Microsoft AI is looking for a talented Senior Software Engineer - Backend at its Vancouver office. This role sits at the heart of strategic decision-making, turning market data into actionable insights for a company that&#39;s revolutionising the sports world with data. You&#39;ll work directly with leadership to shape the company&#39;s direction in the sports data engineering team.</p>
<p><strong>About the Role</strong></p>
<p>The Microsoft Sports Data Engineering team within Microsoft AI is seeking a Senior Software Engineer responsible for designing data ingestion platforms and services, upholding reliable data management standards, and developing and delivering data-driven solutions. These efforts collectively support the creation of advanced, innovative sports experiences. As a Senior Software Engineer, you will provide leadership and architectural guidance in designing and maintaining robust, scalable, and efficient data ingestion pipelines and data services. You will deliver high-quality, thoroughly tested, secure, and maintainable code. You will proactively generate ideas and contribute to the continuous improvement of the technology stack, tools, and development processes. You will collaborate with cross-functional teams to effectively address business requirements while upholding engineering standards and reducing technical debt. You will diagnose and resolve issues arising in both production and development environments. You will research, evaluate, and experiment with innovative technologies to enhance system reliability, efficiency, and consistency. You will embody our culture and values.</p>
<p><strong>Accountabilities</strong></p>
<ul>
<li>Provide leadership and architectural guidance in designing and maintaining robust, scalable, and efficient data ingestion pipelines and data services.</li>
<li>Deliver high-quality, thoroughly tested, secure, and maintainable code.</li>
<li>Proactively generate ideas and contribute to the continuous improvement of the technology stack, tools, and development processes.</li>
</ul>
<p><strong>The Candidate we&#39;re looking for</strong></p>
<p><strong>Experience:</strong></p>
<ul>
<li>4+ years of technical engineering experience coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python.</li>
</ul>
<p><strong>Technical skills:</strong></p>
<ul>
<li>Solid foundation in data structures and algorithms, with demonstrated testing, debugging, and analytical skills.</li>
</ul>
<p><strong>Personal attributes:</strong></p>
<ul>
<li>Excellent communication and collaboration skills.</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary.</li>
<li>Comprehensive benefits package.</li>
<li>Opportunities for professional growth and development.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>CAD $114,400 - CAD $203,900 per year</Salaryrange>
      <Skills>C, C++, C#, Java, JavaScript, Python, data structures, algorithms, testing, debugging, analytical skills, AWS, Azure, Google cloud technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Microsoft AI</Employername>
      <Employerlogo>https://logos.yubhub.co/microsoft.ai.png</Employerlogo>
      <Employerdescription>Microsoft is a multinational technology company that develops, manufactures, licenses, and supports a wide range of software products, services, and devices. The company is known for its operating systems, productivity software, and cloud computing services. Microsoft is a leader in the technology industry and has a strong presence in the global market.</Employerdescription>
      <Employerwebsite>https://microsoft.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://microsoft.ai/job/senior-software-engineer-backend-2/</Applyto>
      <Location>Vancouver</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>80c32163-2d2</externalid>
      <Title>AI Engineer - Player Intelligence and Growth, Data &amp; Insights (D&amp;I)</Title>
      <Description><![CDATA[<p>We are hiring an AI Engineer to join the Player Intelligence &amp; Growth team within Data and Insights (D&amp;I), reporting to a Sr Manager. This team partners with all of EA&#39;s game studios to offer data science &amp; AI products and solutions. For this AI Engineer role we are looking for applied and practical AI/ML expertise with a focus on Gen AI Solutions.</p>
<p><strong>What you&#39;ll do</strong></p>
<p>Work directly with game teams and partners (internal clients) to understand their offerings and domain, and create AI products and solutions that address their use cases.</p>
<p><strong>What you need</strong></p>
<ul>
<li>Graduate degree in Computer Science, Engineering, AI/ML, or a related quantitative field encouraged.</li>
<li>4+ years of experience building AI, ML, or data-driven systems in production environments.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$119,600 - $167,300 CAD</Salaryrange>
      <Skills>AI/ML expertise, Gen AI Solutions, Python, SQL, cloud technologies, experience working with LLMs, embeddings, retrieval systems, AI agents</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/AI-Engineer-Player-Intelligence-and-Growth-Data-Insights-D-I/211264</Applyto>
      <Location>Vancouver</Location>
      <Country></Country>
      <Postedate>2026-03-03</Postedate>
    </job>
    <job>
      <externalid>070bcade-38b</externalid>
      <Title>Software Engineer II</Title>
      <Description><![CDATA[<p>This role is responsible for performing stress and load testing, creating and maintaining test code, executing tests, and providing accurate analysis and reporting of test results. The ideal candidate will have expertise in object-oriented design and programming, strong programming skills in C/C++ or Java, and knowledge of scalable platform architecture and end-to-end systems design.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Perform stress and load testing</li>
<li>Create, maintain, and improve test code in C++/Java/Scala to support applications and features</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>Bachelor&#39;s degree or higher in computer science</li>
<li>3-6 years of relevant experience</li>
<li>Expertise in object-oriented design and programming, with strong skills in C/C++ or Java</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>object-oriented design/programming, C/C++ or Java, scalable platform architecture, visualization &amp; monitoring tools, database technologies, cloud technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts is a leading video game developer and publisher that creates next-level entertainment experiences that inspire players and fans around the world. The company is known for its commitment to innovation, creativity, and community engagement.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Software-Engineer-II/212827</Applyto>
      <Location>Hyderabad</Location>
      <Country></Country>
      <Postedate>2026-02-20</Postedate>
    </job>
    <job>
      <externalid>919fb3e7-c3b</externalid>
      <Title>Technical Artist 2 (AI/ML)</Title>
      <Description><![CDATA[<p>As a Technical Artist, you will collaborate closely with our art and technical art teams to design and implement AI/ML-assisted, high-quality content production workflows. You will bridge the gap between creativity and technology, helping to develop tools and pipelines that reimagine workflows to empower artists to focus on their craft.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Prototype and implement tools that leverage AI/ML to automate or simplify complex art workflows, reducing repetitive tasks and unlocking new creative possibilities.</li>
<li>Integrate DCC applications (e.g., Maya, Photoshop, Substance) into the Frostbite engine, ensuring seamless asset transfer, manipulation, and scalability.</li>
<li>Build intuitive UIs (Python/Web) that make advanced tools accessible to non-technical users; conduct user research and feedback sessions to iterate on functionality and usability.</li>
<li>Explore and apply Large Language Models and other generative AI technologies to accelerate content generation and augment artist workflows.</li>
<li>Partner with engineers, artists, and researchers to prototype and deliver innovative solutions that enhance both technical pipelines and creative processes.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>Bachelor&#39;s degree (or equivalent experience) in a related field.</li>
<li>4+ years of Python development experience.</li>
<li>Proven experience building art pipelines for production.</li>
<li>Experience designing and developing UX + UI (Python/Web).</li>
<li>Familiarity with machine learning pipelines, LLMs, and MCP frameworks.</li>
<li>Working knowledge of cloud technologies and web APIs.</li>
<li>Working Knowledge of DCC applications such as Maya, Photoshop, and Substance.</li>
</ul>
]]></Description>
      <Jobtype>temporary</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$100,000 - $139,500 CAD</Salaryrange>
      <Skills>Python development experience, Experience building art pipelines for production, Experience designing and developing UX + UI (Python/Web), Familiarity with machine learning pipelines, LLMs, and MCP frameworks, Working knowledge of cloud technologies and web APIs, Working Knowledge of DCC applications such as Maya, Photoshop, and Substance</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Technical-Artist-2-AI-ML/212034</Applyto>
      <Location>Vancouver</Location>
      <Country></Country>
      <Postedate>2026-01-21</Postedate>
    </job>
    <job>
      <externalid>1bdd335f-927</externalid>
      <Title>Internship: IT Product Solutions for Vehicle Development</Title>
      <Description><![CDATA[<p><strong>What you&#39;ll do</strong></p>
<p>You&#39;ll be part of our team developing internal applications for various departments. Your focus will be on complex IT system landscapes for vehicle development, optimizing and automating development processes, agile software development, and more.</p>
<ul>
<li>Complex IT system landscapes for vehicle development</li>
<li>Optimization and automation of development processes</li>
<li>Agile software development</li>
<li>Requirements analysis, design, development, and enhancement of IT product solutions</li>
<li>CI/CD pipelines and modern DevOps approaches</li>
<li>Cloud technologies &amp; container-based environments (e.g., Docker)</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>Enrollment in (business) informatics or a comparable degree program (at least in the 3rd semester, or in a gap year between Bachelor&#39;s and Master&#39;s)</li>
<li>Initial practical experience in, and strong motivation for, software development</li>
<li>A keen interest in IT topics and modern software architectures</li>
<li>A structured, self-organized way of working and good organizational skills</li>
<li>A strong service and solution orientation when working with internal customers and project partners</li>
</ul>
<p><strong>Why this matters</strong></p>
<p>As a member of our team, you&#39;ll have the opportunity to work on complex IT system landscapes, optimize and automate development processes, and develop internal applications for various departments. You&#39;ll be part of a dynamic and innovative team that&#39;s passionate about technology and collaboration.</p>
]]></Description>
      <Jobtype>internship</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Software development, IT system landscapes, Agile software development, DevOps, Cloud technologies, Docker, CI/CD pipelines</Skills>
      <Category>IT</Category>
      <Industry>Technology</Industry>
      <Employername>Dr. Ing. h.c. F. Porsche AG</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.porsche.com.png</Employerlogo>
      <Employerdescription>As a team, we have fun with technology – and with collaboration. Together, we develop internal applications for various departments that make a real difference: Our solutions contribute to establishing process stability in vehicle development and after-sales systems and significantly increasing the efficiency of complex processes.</Employerdescription>
      <Employerwebsite>https://jobs.porsche.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.porsche.com/index.php?ac=jobad&amp;id=17963</Applyto>
      <Location>Weissach</Location>
      <Country></Country>
      <Postedate>2025-12-08</Postedate>
    </job>
  </jobs>
</source>