<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>9db40e95-6c3</externalid>
      <Title>Manager Database Integration</Title>
      <Description><![CDATA[<p>The Global Group Lead/Unit Manager for Database and Integration Operations serves as a key leader, driving the efficient operation and continuous improvement of the IT Database and Integration Operations Unit in alignment with the Global (20%) and Regional/AHM (80%) Digital Unit/IT Business Unit objectives.</p>
<p>This role is pivotal in setting direction, orchestrating planning, and providing decisive administrative leadership for daily operations.</p>
<p>As a leader, the manager oversees both operational and analytical database platforms, including data warehouses, data marts, and data lakes, and ensures robust management of platforms supporting APIs, service buses, and messaging systems.</p>
<p>The role demands proactive collaboration with decision-makers across the organization to champion, develop, and implement cost-effective technology solutions and services.</p>
<p>Leadership extends to defining and enforcing IT and Data Platform Services strategies, standards, policies, procedures, and best practices, ensuring the unit’s services and applications are aligned with organizational goals.</p>
<p>The manager represents the unit in key meetings and workshops, often traveling domestically and internationally to foster global initiatives and cross-regional collaboration.</p>
<p>A core aspect of this leadership role is preparing and delivering impactful presentations on strategy, roadmap, and QCD (Quality, Cost, Delivery) performance to stakeholders at all levels.</p>
<p>The manager is deeply committed to coaching and mentoring team members, cultivating a culture of engagement, and driving improvements in people engagement scores throughout the unit.</p>
<p>Key accountabilities include:</p>
<ul>
<li>Collaborating with the Department Manager to shape the direction of the Global Digital Unit/AHM IT, Division, and Department</li>
<li>Developing and presenting strategies, roadmaps, and QCD performance to stakeholders</li>
<li>Developing and sustaining high-performing IT teams</li>
<li>Meeting annually approved project goals for the Enterprise Platform Services Unit</li>
<li>Ensuring unit operations meet platform and service availability targets, comply with governance standards, and stay updated on tools to manage risks and maintain performance</li>
</ul>
<p>Qualifications, experience, and skills required include a Bachelor’s Degree in Information Science and/or equivalent work experience, 12+ years of IT business work experience, prior experience in Database Administration, Data Architecture, or Data Engineering, and 3+ years of leadership experience.</p>
<p>Additional skills include professional writing skills, experience in management-level presentations, experience in developing and managing a budget, and experience in managing vendors.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$129,000.00 - $161,200.00</Salaryrange>
      <Skills>Database Administration, Data Architecture, Data Engineering, Leadership, Communication, Strategic Planning, Project Management, Team Management, Budgeting, Vendor Management</Skills>
      <Category>IT</Category>
      <Industry>Automotive</Industry>
      <Employername>Honda</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.honda.com.png</Employerlogo>
      <Employerdescription>Honda is a multinational corporation that produces automobiles, motorcycles, and power equipment. It is one of the largest automobile manufacturers in the world.</Employerdescription>
      <Employerwebsite>https://careers.honda.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.honda.com/us/en/job/9885/Manager-Database-Integration</Applyto>
      <Location>Torrance</Location>
      <Country></Country>
      <Postedate>2026-04-22</Postedate>
    </job>
    <job>
      <externalid>db261609-388</externalid>
      <Title>Principal Data &amp; Ontology Architect - AI Enablement</Title>
<Description><![CDATA[<p>We are looking for a Principal Data &amp; Ontology Architect to drive the implementation and adoption of data and ontology enablement practices and standards within Control Tower Operations, enabling scalable, governed, and business-aligned AI initiatives.</p>
<p>The successful candidate will serve as the primary bridge between Business Units, Global IT, and Control Tower Operations, ensuring shared understanding of data practices, workflows, and requirements. They will apply established standards for semantic modeling, domain alignment, concept reuse, and ontology lifecycle management.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Supporting the implementation and ongoing maintenance of ontology enablement practices and operating model strategy to support AI, analytics, and digital initiatives across multiple Business Units</li>
<li>Applying established standards for semantic modeling, domain alignment, concept reuse, and ontology lifecycle management</li>
<li>Serving as the enterprise subject-matter authority for ontology-related topics, providing recommendations and guidance to governance and leadership forums</li>
<li>Collaborating with Global IT and enterprise data architecture to ensure ontology practices align with enterprise data platforms and Control Tower operational processes</li>
</ul>
<ul>
<li>Partnering with Business Units to understand domain concepts, terminology, operational data, and AI use cases, translating them into ontology-aligned data structures</li>
<li>Guiding Business Units in contributing domain models, metadata, and data assets into the enterprise ontology using defined governance and intake processes</li>
<li>Enabling repeatable onboarding of Business Unit data into AI initiatives, reducing reliance on ad-hoc IT engagement and minimizing duplicated effort</li>
</ul>
<ul>
<li>Serving as a liaison between Business Units and Global IT for AI data and ontology-related matters</li>
<li>Engaging with Global IT teams to understand enterprise data platforms, workflows, standards, and operational constraints</li>
<li>Translating Global IT practices, requirements, and workflows into clear, actionable guidance for Business Unit data stewards</li>
</ul>
<ul>
<li>Educating, guiding, and supporting Business Unit data stewards on their roles in data governance, ontology contribution, and AI data enablement</li>
<li>Supporting the development and documentation of workflows, expectations, and operating models for how BU data stewards engage with the Control Tower and Global IT</li>
</ul>
<ul>
<li>Ensuring Business Unit Data Stewards understand how to prepare, govern, and submit data assets for ontology integration and AI use</li>
<li>Promoting consistent adoption of governance, quality, and semantic standards across Business Units</li>
</ul>
<ul>
<li>Supporting integration of data and ontology enablement into Control Tower workflows</li>
<li>Providing operational insight into data readiness, semantic risks, and governance gaps to inform Control Tower decision-making</li>
<li>Identifying systemic issues and contributing recommendations to drive continuous improvement of data enablement processes</li>
</ul>
<ul>
<li>Ensuring semantic integrity, data quality, lineage, and consistency are maintained as data assets flow into AI solutions</li>
<li>Identifying systemic issues and recommending continuous improvement opportunities to Control Tower Operations leadership</li>
<li>Influencing corrective actions, tooling investments, or governance updates to mitigate long-term risk</li>
</ul>
<p>This role requires a minimum of 10 years of relevant work experience in data architecture, data governance, ontology development, semantic modeling, or related disciplines, supporting cross-functional initiatives spanning multiple business units and IT organizations.</p>
<p>The ideal candidate will have in-depth expertise in ontology design, semantic modeling, and domain-driven data architecture, as well as experience contributing to the development and implementation of data and ontology strategies. They will also have demonstrated experience serving as a bridge between business stakeholders and IT organizations, with a strong ability to translate technical platforms, workflows, and constraints into business-understandable guidance.</p>
<p>A Bachelor&#39;s level degree or diploma in Computer Science, Data Science/Engineering, Applied Mathematics/Statistics, Electronics/Electrical, Information Technology/Information Sciences, or a related field of study is required. A Master&#39;s or Ph.D. degree is preferred.</p>
<p>The successful candidate will be comfortable operating in ambiguous, evolving environments with enterprise-level impact, and will have a systems-thinking mindset with understanding of AI, analytics, and enterprise data platforms.</p>
<p>Highly desirable skills include proficiency in OWL (Web Ontology Language), RDF/RDFS – graph-based data model, storage in graph databases such as Neo4j or Amazon Neptune, and querying using SPARQL for RDF-based ontologies.</p>
<p>This is an onsite job based at our ADC, Raymond, OH office. One telecommuting workday per week may be possible with prior departmental approval.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$120,400.00 - $150,500.00</Salaryrange>
      <Skills>ontology design, semantic modeling, domain-driven data architecture, data governance, AI data enablement, data quality, lineage, consistency, OWL (Web Ontology Language), RDF/RDFS – graph-based data model, graph databases, Neo4j, Amazon Neptune, SPARQL, ontology development, data architecture, data science, electronics, electrical, information technology, information sciences</Skills>
      <Category>Engineering</Category>
      <Industry>Automotive</Industry>
      <Employername>Honda</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.honda.com.png</Employerlogo>
      <Employerdescription>Honda is a multinational Japanese conglomerate that produces automobiles, motorcycles, and power equipment. It is one of the largest automobile manufacturers in the world.</Employerdescription>
      <Employerwebsite>https://careers.honda.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.honda.com/us/en/job/10812/Principal-Data-Ontology-Architect-AI-Enablement</Applyto>
      <Location>Raymond</Location>
      <Country></Country>
      <Postedate>2026-04-22</Postedate>
    </job>
    <job>
      <externalid>5b244f27-9fd</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
<Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term engagements addressing their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>You will work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases. You will work with engagement managers to scope a variety of professional services work with input from the customer.</p>
<p>You will guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications. You will consult on architecture and design, and bootstrap or implement customer projects that lead to customers&#39; successful understanding, evaluation, and adoption of Databricks.</p>
<p>You will provide an escalated level of support for customer operational issues, working with the Databricks technical team, Project Manager, Architect, and customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</p>
<p>You will work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues.</p>
<p>The ideal candidate will have:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfort writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Experience designing and deploying performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
</ul>
<p>Travel to customers 20% of the time.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461258002</Applyto>
      <Location>Raleigh, North Carolina</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>63a79841-36e</externalid>
      <Title>Solutions Architect (Vietnam)</Title>
      <Description><![CDATA[<p>At Databricks, we&#39;re seeking a Solutions Architect to join our Field Engineering team in Vietnam. As a key member of our team, you will work closely with customers to understand their complex data challenges and provide technical expertise to demonstrate how our Data Intelligence Platform can help them solve these issues.</p>
<p>You will form successful relationships with clients throughout Vietnam, providing technical and business value to Databricks customers in collaboration with Account Executives. You will operate as an expert in big data analytics, developing into a &#39;champion&#39; and trusted advisor on multiple issues of architecture, design, and implementation to lead to the successful adoption of the Databricks Data Intelligence Platform.</p>
<p>Your responsibilities and required qualifications include:</p>
<ul>
<li>Developing customer relationships and building internal partnerships with account executives and teams</li>
<li>Engaging customers in technical sales, challenging their questions, guiding clear outcomes, and communicating technical and value propositions</li>
<li>Prior experience with coding in a core programming language (i.e., Python, Java, Scala) and willingness to learn a base level of Spark</li>
<li>Proficient with Big Data Analytics technologies, including hands-on expertise with complex proofs-of-concept and public cloud platform(s)</li>
<li>Experienced in use case discovery, scoping, and delivering complex solution architecture designs to multiple audiences requiring an ability to context switch in levels of technical depth</li>
<li>Proficiency in the Vietnamese language is required as this role serves clients based in Vietnam and involves direct customer communications in the Vietnamese language</li>
</ul>
<p>In return, you will have the opportunity to grow your knowledge and expertise to the level of a technical and/or industry specialist, and contribute to the success of our customers and the growth of our organization.</p>
<p>If you&#39;re passionate about working with data and AI, and want to make a real impact, we encourage you to apply for this exciting opportunity.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, Java, Scala, Big Data Analytics, Spark, Cloud Computing, Data Science, Machine Learning, Data Engineering, Data Architecture, Cloud Security, DevOps</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data science and analytics. Over 10,000 organizations worldwide rely on its Data Intelligence Platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8472732002</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3a2daa2d-9ff</externalid>
      <Title>Manager, Field Engineering</Title>
      <Description><![CDATA[<p>As a Manager, Field Engineering (Solutions Architects), you will build and lead a team of pre-sales Solutions Architects focusing on your assigned accounts. Your experience partnering with the sales organisation will help close revenue with the right approach whilst coaching new sales and pre-sales team members to work together.</p>
<p>You will guide and get involved to enhance your team&#39;s effectiveness; be an expert at communicating complex, business value-focused solutions; support complex sales cycles; and build relationships with key stakeholders in your customers&#39; companies.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Managing hiring and building the pre-sales team of Solutions Architects</li>
<li>Rapidly scaling the designated Field Engineering segment organisation without sacrificing quality</li>
<li>Building a collaborative culture within a rapid-growth team</li>
</ul>
<p>You will also embody and promote Databricks&#39; customer-obsessed, team-oriented, and diverse culture.</p>
<ul>
<li>Supporting a 2-3x increase in the return on investment of SA involvement in sales cycles over 18 months</li>
<li>Promoting a solution- and value-based selling culture across the field-engineering organisation</li>
<li>Displaying an understanding of business needs and revenue potential for accounts in the assigned region</li>
<li>Building Databricks&#39; brand in partnership with the Marketing and Sales team</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Big Data, Cloud, SaaS, Data Architecture, Data Engineering, Database technologies, Data Science, Digital Native companies/ecosystems, AI, Cloud software models</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. Over 10,000 organisations worldwide, including Comcast, Condé Nast, and Grammarly, rely on Databricks.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8316258002</Applyto>
      <Location>Melbourne, Australia</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>fdc6f0f9-900</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to customers&#39; successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>Zones 1-4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, distributed computing, CI/CD, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461168002</Applyto>
      <Location>Los Angeles, California</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>477d343e-e37</externalid>
      <Title>Customer Success Architect</Title>
      <Description><![CDATA[<p>About Mixpanel</p>
<p>Mixpanel turns data clarity into innovation. Trusted by more than 29,000 companies, including Workday, Pinterest, LG, and Rakuten Viber, Mixpanel’s AI-first digital analytics help teams accelerate adoption, improve retention, and ship with confidence. Powering this is an industry-leading platform that combines product and web analytics, session replay, experimentation, feature flags, and metric trees.</p>
<p>About the Customer Success Team:</p>
<p>Mixpanel’s Customer Success &amp; Solutions Engineering teams are analytics consultants who embed themselves within our enterprise customer teams to drive our customers’ business outcomes. We work with prospects and customers throughout the customer journey to understand what drives value and serve as the technical counterpart to our Sales organization to deliver on that value.</p>
<p>You will partner closely with Account Executives, Account Managers, Product, Engineering, and Support to successfully roll out self-serve analytics within our customers’ organizations, help the customer manage change, execute on technical projects and services that delight our customers, and ultimately drive ROI on the customer’s Mixpanel investment.</p>
<p>About the Role:</p>
<p>As a CSA, you will partner with customers throughout the customer journey to understand what drives value, beginning from the pre-sales running proof of concepts to demonstrate quick time to value, to post-sales onboarding and implementation, where you set customers up for long-term success with scalable implementation and data governance best practices. Throughout the entire customer lifecycle, you will work to understand how analytics can drive business value for your customers and will consult them on how to maximize the value of Mixpanel, including managing change during Mixpanel’s rollout, defining and achieving ROI, and identifying areas of improvement in their current usage of analytics.</p>
<p>For large enterprise customers, post-onboarding, you will continue to work alongside the Account Managers to drive data trust and product adoption across 100+ end-user teams through a change management rollout approach.</p>
<p>Responsibilities:</p>
<ul>
<li>Serve as a trusted technical advisor for prospects/customers to provide strategic consultation on data architecture, governance, instrumentation, and business outcomes</li>
<li>Effectively communicate at most levels of the customer’s organization to influence business outcomes via Mixpanel, design and execute a comprehensive analytics strategy, and unblock technical and organizational roadblocks</li>
<li>Own the customer’s success with Mixpanel, documenting and delivering ROI to the customer throughout their journey to transform their business with self-serve analytics</li>
<li>Own onboarding and data health for your assigned customers/projects, including ongoing enhancements to their data quality and overall tech stack integration</li>
<li>Engage with customers’ engineering, product management, and marketing teams to handle technical onboarding, optimize Mixpanel deployments, and improve data trust</li>
<li>Deliver a variety of technical services, ranging from data architecture consultations to adoption and change management best practices</li>
<li>Leverage modern data architecture expertise to create scalable data governance practices and data trust for our customers, including data optimization and re-implementation projects</li>
<li>Successfully execute on success outcomes whilst balancing project timelines, scope creep, and unanticipated issues</li>
<li>Bridge the technical-business gap with your customers, working with business stakeholders to define a strategic vision for Mixpanel and then working with the right business and technical contacts to execute that vision</li>
<li>Collaborate with our technical and solutions partners as needed on data optimization and onboarding projects</li>
<li>Be a technical sponsor for internal engagements with Mixpanel product and engineering teams to prioritize product and systems tasks from clients</li>
</ul>
<p>We&#39;re Looking For Someone Who Has</p>
<p>3 to 5 years of experience consulting on defining and delivering ROI through new tool implementations</p>
<p>Experience working with Director-level members of the customer organization to define a strategic vision and successfully leveraging those members to deliver on that vision</p>
<p>The ability to communicate with stakeholders at most levels of an organization , from talking with developers about the ins and outs of an API to talking to a Director of Data Science/Product Management about organizational efficiency</p>
<p>The ability to manage complex projects with assorted client stakeholders, working across teams and departments to execute real change</p>
<p>A demonstrated record of success in customer success, client-facing professional services, consulting, or technical project management roles</p>
<p>Excellent written, analytical, and communication skills</p>
<p>Strong process and/or project delivery discipline</p>
<p>Eagerness to learn new technologies and adapt to evolving customer needs</p>
<p>We&#39;d Be Extra Excited For Someone Who Has</p>
<p>Experience in data querying, modeling, and transformation in at least one core tool, including SQL / dbt / Python / Business Intelligence tools / Product Analytics tools, etc.</p>
<p>Familiarity with databases and cloud data warehouses like Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, etc.</p>
<p>Familiarity with product analytics implementation methods like SDKs, Customer Data Platforms (CDPs), Event Streaming, Reverse ETL, etc.</p>
<p>Familiarity with analytics best practices across business segments and verticals</p>
<p>Benefits and Perks</p>
<p>Comprehensive Medical, Vision, and Dental Care</p>
<p>Mental Wellness Benefit</p>
<p>Generous Vacation Policy &amp; Additional Company Holidays</p>
<p>Enhanced Parental Leave</p>
<p>Volunteer Time Off</p>
<p>Additional US Benefits: Pre-Tax Benefits including 401(K), Wellness Benefit, Holiday Break</p>
<p>Culture Values</p>
<p>Make Bold Bets: We choose courageous action over comfortable progress.</p>
<p>Innovate with Insight: We tackle decisions with rigor and judgment - combining data, experience and collective wisdom to drive powerful outcomes.</p>
<p>One Team: We collaborate across boundaries to achieve far greater impact than any of us could accomplish alone.</p>
<p>Candor with Connection: We build meaningful relationships that enable honest feedback and direct conversations.</p>
<p>Champion the Customer: We seek to deeply understand our customers’ needs, ensuring their success is our north star.</p>
<p>Powerful Simplicity: We find elegant solutions to complex problems, making sophisticated things accessible.</p>
<p>Why choose Mixpanel?</p>
<p>We’re a leader in analytics with over 9,000 customers and $277M raised from prominent investors like Andreessen Horowitz, Sequoia, YC, and, most recently, Bain Capital.</p>
<p>Mixpanel’s pioneering event-based data analytics platform offers a powerful yet simple solution for companies to understand user behaviors and easily track overarching company success metrics.</p>
<p>Our accomplished teams continuously facilitate our expansion by tackling the ever-evolving challenges tied to scaling, reliability, design, and service.</p>
<p>Choosing to work at Mixpanel means you’ll be helping the world’s most innovative companies learn from their data so they can make better decisions.</p>
<p>Mixpanel is an equal opportunity employer supporting workforce diversity.</p>
<p>At Mixpanel, we are focused on the things that really matter: our people, our customers, and our partners, out of a recognition that those relationships are the most valuable assets we have.</p>
<p>We actively encourage women, people with disabilities, veterans, underrepresented minorities, and LGBTQ+ people to apply.</p>
<p>We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity or expression, sexual orientation, age, marital status, or any other protected characteristic.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data architecture, governance, instrumentation, business outcomes, data querying, modeling, transforming, SQL, dbt, Python, Business Intelligence tools, Product Analytics tools, databases, cloud data warehouses, Google Cloud, Amazon Redshift, Microsoft Azure, Snowflake, Databricks, SDKs, Customer Data Platforms, Event Streaming, Reverse ETL</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mixpanel</Employername>
      <Employerlogo>https://logos.yubhub.co/mixpanel.com.png</Employerlogo>
      <Employerdescription>Mixpanel is a leading provider of digital analytics software, serving over 29,000 companies worldwide.</Employerdescription>
      <Employerwebsite>https://mixpanel.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mixpanel/jobs/7506821</Applyto>
      <Location>Bengaluru, India (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>53ee0ef3-c62</externalid>
      <Title>Staff Data Engineer, Analytics Data Engineering</Title>
      <Description><![CDATA[<p>We are looking for a Staff Data Engineer to join our Analytics Data Engineering (ADE) team within Data Science &amp; AI Platform. As a Staff Data Engineer, you will be responsible for solving cross-cutting data challenges that span multiple lines of business while driving standardization in how we build, deploy, and govern analytics pipelines across Dropbox.</p>
<p>This is not a maintenance role. We are modernizing our analytics platform, upgrading orchestration infrastructure, building shared and reusable data models with conformed dimensions, establishing a certified metrics framework, and laying the foundation for AI-native data development. You will partner closely with Data Science, Data Infrastructure, Product Engineering, and Business Intelligence teams to make this happen.</p>
<p>You will play a crucial role in establishing analytics engineering standards, designing scalable data models, and driving cross-functional alignment on data governance. You will get substantial exposure to senior leadership, shape the technical direction of analytics infrastructure at Dropbox, and directly influence how data powers product and business decisions.</p>
<p>Responsibilities:</p>
<ul>
<li>Lead the design and implementation of shared, reusable data models, defining shared fact tables, conformed dimensions, and a semantic/metrics layer that serves as the single source of truth across analytics functions</li>
<li>Drive standardization of data engineering practices across ADE and functional analytics teams, including pipeline patterns, CI/CD workflows, naming conventions, and data modeling standards</li>
<li>Partner with Data Infrastructure to modernize orchestration, improve pipeline decomposition, and establish secure dev/test environments with production data access</li>
<li>Architect and implement a shift-left data governance strategy, working with upstream data producers to establish data contracts, SLOs, and code-enforced quality gates that catch issues before production</li>
<li>Collaborate with Data Science leads and Product Management to translate metric definitions into reliable, certified data pipelines that power executive dashboards, WBR reporting, and growth measurement</li>
<li>Reduce operational burden by improving pipeline granularity, observability, and failure recovery, establishing runbooks and alerting standards that make on-call sustainable</li>
<li>Evaluate and integrate AI-native tooling into the data development lifecycle, enabling conversational data exploration with guardrails and AI-assisted pipeline development</li>
</ul>
<p>Requirements:</p>
<ul>
<li>BS degree in Computer Science or related technical field, or equivalent technical experience</li>
<li>12+ years of experience in data engineering or analytics engineering with increasing scope and technical leadership</li>
<li>12+ years of SQL experience, including complex analytical queries, window functions, and performance optimization at scale (Spark SQL)</li>
<li>8+ years of Python development experience, including building and maintaining production data pipelines</li>
<li>Deep expertise in dimensional data modeling, schema design, and scalable data architecture, with hands-on experience building shared data models across multiple business domains</li>
<li>Strong experience with orchestration tools (Airflow strongly preferred) and dbt, including pipeline design, scheduling strategies, and failure recovery patterns</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Experience with Databricks (Unity Catalog, Delta Lake) and modern lakehouse architectures</li>
<li>Experience leading orchestration or platform modernization efforts at scale</li>
<li>Familiarity with data governance and observability tools such as Atlan, Monte Carlo, Great Expectations, or similar</li>
<li>Experience building or contributing to a metrics/semantic layer (dbt MetricFlow, Databricks Metric Views, or equivalent)</li>
<li>Track record of establishing data engineering standards and best practices in a federated analytics organization</li>
</ul>
<p>Compensation:</p>
<p>US Zone 2 $198,900-$269,100 USD</p>
<p>US Zone 3 $176,800-$239,200 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$198,900-$269,100 USD</Salaryrange>
      <Skills>SQL, Python, Dimensional data modeling, Schema design, Scalable data architecture, Orchestration tools, dbt, Databricks, Modern lakehouse architectures, Data governance and observability tools, Metrics/semantic layer</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Dropbox</Employername>
      <Employerlogo>https://logos.yubhub.co/dropbox.com.png</Employerlogo>
      <Employerdescription>Dropbox is a technology company that provides cloud storage and file sharing services.</Employerdescription>
      <Employerwebsite>https://www.dropbox.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/dropbox/jobs/7595183</Applyto>
      <Location>Remote - US: Select locations</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>2f962d3f-14e</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to customers&#39; successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Willingness to build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461218002</Applyto>
      <Location>Dallas, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>d8a17638-e52</externalid>
      <Title>Account Executive NATO</Title>
      <Description><![CDATA[<p>Elastic, the Search AI company, is looking for an Enterprise Account Executive to drive net-new revenue and expansion within our developing relationship with NATO, Brussels. You&#39;ll be the owner of this unique customer where you&#39;ll build your own pipeline and close engagements - telling the Elastic Search AI story, and close complex, multi-stakeholder deals in a consumption-based model.</p>
<p>As an Enterprise Account Executive, you will:</p>
<ul>
<li>Own your customer &amp; build pipeline and close any ongoing engagements.</li>
<li>Deep discovery &amp; qualification: Uncover pain, business impact, budget, and decision criteria using frameworks like MEDDPICC so you chase only the highest-confidence deals.</li>
<li>Value storytelling &amp; demos: Craft and deliver tailored narratives and live demos that map Elastic’s Search, Observability, and Security capabilities to measurable business outcomes.</li>
<li>Mutual deal strategy &amp; forecast accuracy: Collaborate with your customer to build formal close plans and keep your CRM up-to-date.</li>
<li>Executive negotiation &amp; closing: Lead high-stakes contract and pricing discussions; defend your value, structure give/get trades, and land multi-year consumption commitments.</li>
<li>Domain &amp; cloud acumen: Position Elastic as the Search AI platform of choice by speaking fluently about cloud economics, usage-based pricing, and modern data architectures.</li>
<li>Cross-functional partnership: Work hand-in-glove with Solutions Architects, Customer Success, Marketing, and RevOps to accelerate deals and drive exceptional customer outcomes.</li>
</ul>
<p>We&#39;re looking for someone with:</p>
<ul>
<li>Proven experience working with or for NATO, with existing relationships.</li>
<li>Expert discovery &amp; qualification skills: Demonstrated ability to apply MEDDPICC or equivalent frameworks to drive disciplined pipeline and eliminate low-probability deals.</li>
<li>Compelling value storytelling: Track record of delivering executive-level presentations and demos that tie product capabilities to real dollars saved, revenue gained, or risk mitigated.</li>
<li>Technical &amp; cloud fluency: Comfortable discussing a broad range of technical topics including observability, security, vector/traditional search, and cloud cost optimization.</li>
<li>Collaborative mindset &amp; coachability: A learner who partners effectively with internal teams, incorporates feedback, and embodies Elastic’s values of community and openness.</li>
<li>Open Source enthusiasm: Genuine appreciation for open-source communities and the Elastic model.</li>
</ul>
<p>Bonus Points:</p>
<ul>
<li>Prior experience of projects with geospatial content.</li>
<li>Familiarity with observability (logs, metrics, traces) or security analytics (SIEM/XDR) use cases.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>MEDDPICC, Search AI, Observability, Security, Cloud economics, Usage-based pricing, Modern data architectures, Cloud cost optimization, Geospatial content, Observability (logs, metrics, traces), Security analytics (SIEM/XDR)</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Elastic</Employername>
      <Employerlogo>https://logos.yubhub.co/elastic.co.png</Employerlogo>
      <Employerdescription>Elastic is a software company that develops and distributes technology for search, security, and observability. It has a global presence with customers across various industries.</Employerdescription>
      <Employerwebsite>https://www.elastic.co/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/elastic/jobs/7668021</Applyto>
      <Location>Belgium</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0036f074-845</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects which lead to customers&#39; successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, design and deployment of highly performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456966002</Applyto>
      <Location>Boston, Massachusetts</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>40db054b-06d</externalid>
      <Title>Senior Product Manager, Access</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Senior Technical Product Manager to join our Access team within the Acuity Scheduling department. As a Senior Technical Product Manager for Acuity Scheduling, you&#39;ll own the systems that control how customers sign in, manage identity, and pay for the platform.</p>
<p>This is a hybrid role working 3 days per week from our Aveiro office. You will report to the Group Product Manager on the Acuity Scheduling team.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Product Ownership: Act as the primary product owner for your team, setting the roadmap and priorities based on technical feasibility, user impact, and business goals.</li>
<li>Technical Strategy &amp; Roadmap: Collaborate with engineers to define a scalable technical strategy that aligns with product goals, focusing on data architecture, systems design, and service-based solutions.</li>
<li>Cross-functional Collaboration: Partner with engineering, data science, and UX teams to understand requirements, manage trade-offs, and deliver solutions that balance speed and scalability.</li>
<li>Cross-organization Collaboration: Work directly with Squarespace Identity and Security teams to develop and realise a shared vision of a singular identity and authentication system.</li>
<li>Stakeholder Communication: Translate technical architecture and system requirements into clear, actionable items for stakeholders across the company, including senior leadership and non-technical teams.</li>
<li>Quality, Security &amp; Performance Optimisation: Focus on the system&#39;s stability, reliability, and scalability by working closely with the engineering team on continuous improvement, platform security and technical debt management.</li>
<li>Architecture Oversight: Guide architectural decisions to ensure optimal security, data flow, storage, and access within our product ecosystem. Advocate for sustainable choices in a service-oriented approach to component-based architecture.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Experience: 4-6 years in product management or a technical role with product ownership, preferably within a data-driven environment; ideally in an identity-centric role.</li>
<li>Technical Expertise: Strong background as the technical product lead on teams owning data architecture, systems design, and service-based architecture (e.g., microservices), with the ability to engage deeply in technical discussions and decisions.</li>
<li>Systems Thinking: Proven experience in end-to-end system thinking and design, including a strong grasp of component-based architectures, data storage options, and integration layers.</li>
<li>Data Architecture: Hands-on experience with data modeling, database design, and data warehousing principles, including familiarity with large data model improvement initiatives.</li>
<li>APIs &amp; Integration: Understanding of RESTful API design, OAuth, identity federation, and integration patterns to ensure seamless interoperability between services and systems.</li>
<li>Analytical Mindset: Proficiency in using data to inform decisions autonomously, including experience with data analysis and product analytics tools.</li>
<li>Communication Skills: Ability to communicate complex technical concepts to both technical and non-technical audiences, bridging the gap between product vision and technical execution.</li>
<li>Agile Experience: Familiarity with Agile methodologies, including backlog management, sprint planning, and cross-functional team collaboration.</li>
<li>Project Management: Familiarity with project management tools like Jira, Asana, or similar.</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Technical Transformation: Experience leading teams and/or organisations from a monolithic architecture to a service-oriented one.</li>
<li>Identity Experience: High-level understanding of and experience with core user registration flows and OAuth as it pertains to user identity needs.</li>
<li>Technical Documentation: Experience documenting technical data architecture, service flows, and system dependencies to ensure alignment and knowledge-sharing within the team</li>
</ul>
<p><strong>Benefits &amp; Perks</strong></p>
<ul>
<li>Health insurance with 100% covered premiums for you, your spouse or partner, and dependent children, including medical, dental, and vision</li>
<li>Life and Disability Insurance</li>
<li>Pension benefits with employer match</li>
<li>Fertility and adoption benefits</li>
<li>Headspace mindfulness app subscription</li>
<li>Global Employee Assistance Program</li>
<li>Statutory paid time off and all statutory leaves, as required</li>
<li>Meal Allowance and Flex Benefits Account</li>
<li>Employee donation match to community organisations</li>
<li>Office located in the easily accessible city centre of Aveiro</li>
<li>7 Global Employee Resource Groups (ERGs)</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data architecture, systems design, service-based architecture, microservices, data modeling, database design, data warehousing, RESTful API design, OAuth, identity federation, integration patterns, data analysis, product analytics tools, Agile methodologies, backlog management, sprint planning, cross-functional team collaboration, project management tools, Jira, Asana</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Squarespace</Employername>
      <Employerlogo>https://logos.yubhub.co/squarespace.com.png</Employerlogo>
      <Employerdescription>Squarespace is a design-driven platform helping entrepreneurs build brands and businesses online. It has a team of over 1,700 employees and operates in more than 200 countries.</Employerdescription>
      <Employerwebsite>https://www.squarespace.com/about/careers</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/squarespace/jobs/7698954</Applyto>
      <Location>Aveiro</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>76defb4f-35e</externalid>
      <Title>Head of Finance Systems &amp; Automation</Title>
      <Description><![CDATA[<p><strong>About the Role</strong></p>
<p>The Head of Financial Systems will own the strategy, implementation, and continuous improvement of Scale AI&#39;s finance technology stack.</p>
<p>You are a builder-first leader who blends enterprise architecture with AI automation - eliminating manual workflows, tightening the financial data fabric, and enabling the Finance team to move with the speed and rigor the business demands.</p>
<p>You will be the primary technology partner to the Finance leadership team, and the person ultimately accountable for the reliability, scalability, and intelligence of every system that touches the financial close, revenue, and spend lifecycle.</p>
<p><strong>Key Responsibilities</strong></p>
<p><strong>ERP Ownership &amp; Optimization</strong></p>
<p>Own the full lifecycle of our ERP environment (NetSuite or equivalent), including architecture, configuration, integrations, and roadmap.</p>
<p>Ensure the platform reliably supports the General Ledger, AP, AR, Procure-to-Pay, Fixed Assets, Revenue Recognition, Sales Audit, and International Consolidations &amp; Reporting.</p>
<p><strong>Order-to-Cash &amp; Procure-to-Pay</strong></p>
<p>Drive end-to-end process excellence across Order-to-Cash (Sales/Revenue Capture, Sales Audit, Billing) and Procure-to-Pay (Direct &amp; Indirect Purchasing).</p>
<p>Identify gaps between upstream systems and ERP, and close them through disciplined integration design.</p>
<p><strong>AI Agent Deployment</strong></p>
<p>Identify opportunities to replace manual finance workflows with AI/LLM-powered agents.</p>
<p>Build and manage internal agents that automate forecasting inputs, variance analysis, close task management, and reconciliation, reducing cycle time and surfacing actionable insights for Finance leadership.</p>
<p><strong>Systems Integration &amp; Data Architecture</strong></p>
<p>Design scalable data flows and APIs that connect the ERP to adjacent finance tools (billing, expense, FP&amp;A, treasury) into a cohesive, auditable ecosystem.</p>
<p>Ensure clean, governed data movement from source systems to the general ledger and reporting layer.</p>
<p><strong>Operational Excellence &amp; Controls</strong></p>
<p>Establish disciplined change management, clear data governance, and measurable SLAs across all financial systems.</p>
<p>Stabilize environments to support audit readiness, SOX compliance, and regulatory requirements.</p>
<p><strong>Finance Stakeholder Partnership</strong></p>
<p>Serve as the primary technology partner to the CFO, Controller, FP&amp;A, and Accounting teams.</p>
<p>Translate complex technical constraints into business outcomes and align system roadmaps with Finance&#39;s strategic priorities.</p>
<p><strong>Team Leadership</strong></p>
<p>Build and lead a high-impact Finance Systems team.</p>
<p>Foster a culture of curiosity, speed, and user-centricity, where the team is as proud of a clean reconciliation workflow as of any external product shipped.</p>
<p><strong>What You&#39;ll Bring</strong></p>
<ul>
<li>8+ years of experience leading finance systems or enterprise application functions in a fast-paced, high-growth environment</li>
<li>Deep, hands-on knowledge of ERP platforms (NetSuite strongly preferred) across core modules: GL, AP, AR, Fixed Assets, Revenue Recognition, Procure-to-Pay, Sales Audit, and International Consolidations</li>
<li>Strong understanding of Order-to-Cash and Procure-to-Pay process design, including the touchpoints between upstream sales/procurement systems and the ERP</li>
<li>Strong understanding of SOX IT General Controls and a track record of implementing them</li>
<li>Proven track record of designing and deploying AI/LLM-powered workflows or agentic systems to improve Finance team efficiency</li>
<li>Ability to design and govern scalable integrations and data flows across SaaS tools</li>
<li>Experience influencing Finance leadership, translating technical complexity into clear business decisions</li>
<li>Proficiency in an iPaaS platform (e.g., Workato, MuleSoft, Boomi)</li>
<li>Workato certification is a big plus</li>
<li>Proficiency in Python or SQL to personally prototype, audit, and quality-check automation logic</li>
<li>Familiarity with adjacent Finance tooling: expense management, FP&amp;A platforms, billing systems, and treasury tools</li>
</ul>
<p><strong>Compensation</strong></p>
<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits.</p>
<p>The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training.</p>
<p>Scale employees in eligible roles are also granted equity-based compensation, subject to Board of Director approval.</p>
<p>Your recruiter can share more about the specific salary range for your preferred location during the hiring process, and confirm whether the hired role will be eligible for equity grant.</p>
<p>You’ll also receive benefits including, but not limited to:</p>
<ul>
<li>Comprehensive health, dental, and vision coverage</li>
<li>Retirement benefits</li>
<li>A learning and development stipend</li>
<li>Generous PTO</li>
</ul>
<p>Additionally, this role may be eligible for additional benefits such as a commuter stipend.</p>
<p><strong>Base Salary Range</strong></p>
<p>The base salary range for this full-time position in the location of San Francisco is $198,400-$248,000 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$198,400-$248,000 USD</Salaryrange>
      <Skills>ERP platforms, NetSuite, Order-to-Cash, Procure-to-Pay, AI/LLM-powered agents, Systems integration, Data architecture, Operational excellence, SOX compliance, Regulatory requirements, iPaaS platform, Python, SQL, Expense management, FP&amp;A platforms, Billing systems, Treasury tools</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>Scale AI</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale AI develops reliable AI systems for the world&apos;s most important decisions.</Employerdescription>
      <Employerwebsite>https://www.scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4683055005</Applyto>
      <Location>San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>19182c1d-b27</externalid>
      <Title>Solutions Architect - UAE</Title>
      <Description><![CDATA[<p>At Databricks, our core values are at the heart of everything we do; creating a culture of proactiveness and a customer-centric mindset guides us to create a unified platform that makes data science and analytics accessible to everyone.</p>
<p>We aim to inspire our customers to make informed decisions that push their business forward. We provide a user-friendly and intuitive platform that makes it easy to turn insights into action and fosters a culture of creativity, experimentation, and continuous improvement.</p>
<p>As a Solutions Architect in the UAE Pre-Sales team, you will be an essential part of this mission, using your technical expertise to demonstrate how our Data Intelligence Platform can help customers solve their complex data challenges.</p>
<p>You&#39;ll work with a collaborative, customer-focused team that values innovation and creativity, using your skills to create customised solutions to help our customers achieve their goals and guide their businesses forward.</p>
<p>Join us in our quest to change how people work with data and make a better world!</p>
<p>The impact you will have:</p>
<ul>
<li>Create impactful and successful relationships with customer accounts in the United Arab Emirates, providing technical and business value to Databricks customers in collaboration with the extended team.</li>
<li>Become the trusted advisor of your customers on the Data and AI landscape by successfully driving and delivering the adoption of the Databricks Data Intelligence Platform.</li>
<li>Enable partners and support internal events in the MEA region.</li>
<li>Scale best practices in your field by authoring reference architectures, how-tos, and demo applications, and help build the Databricks community in your region by leading workshops, seminars, and meet-ups.</li>
<li>Grow your knowledge and expertise to the level of a technical and/or industry specialist.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Experienced in customer interactions in a technical pre-sales capacity and adept in managing complex sales lifecycles.</li>
<li>Experienced in use case discovery, scoping, and delivering complex solution architecture designs to multiple audiences, requiring an ability to switch context and/or levels of technical depth.</li>
<li>Ability to provide technical solutions for specialised customer needs, navigate a competitive landscape, and effectively develop relationships to achieve long-term customer success.</li>
<li>Hands-on expertise with complex Big Data architecture design for public cloud platform(s) solutions, focusing on use cases in Data Warehousing and Data Engineering architecture and implementation. Data Science and Machine Learning skills will be advantageous.</li>
<li>Prior experience with coding in a core programming language (e.g., Python, SQL) and willingness to learn Apache Spark™.</li>
<li>Experience and skills on the Databricks platform will be highly advantageous for the role.</li>
<li>Excellent communication skills in English are required as a minimum. Fluency in Arabic will be highly preferable for the position.</li>
</ul>
<p>Key Notes:</p>
<ul>
<li>Location for the role will be in Paris (i.e. within a commutable distance for a hybrid schedule).</li>
<li>You will need to be flexible and willing to travel to the United Arab Emirates for customer visits on a regular basis (up to ~2 weeks per month).</li>
<li>We are seeking a candidate who will be interested in a future relocation to the region (Dubai) when an office is opened.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>customer interactions, technical pre-sales capacity, complex sales lifecycles, use case discovery, solution architecture designs, Big Data architecture design, public cloud platform(s), Data Warehousing, Data Engineering, Apache Spark, Python, SQL, Data Science, Machine Learning, Arabic</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data science and analytics.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8287419002</Applyto>
      <Location>Paris, France</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>4ea7999b-3d8</externalid>
      <Title>Resident Solutions Architect - Healthcare &amp; Life Sciences</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to customers&#39; successful understanding, evaluation, and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494145002</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>90cf972f-2cf</externalid>
      <Title>Senior Data Analyst – Insights &amp; Analytics (Revenue Operations)</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Senior Data Analyst to join the Insights &amp; Analytics team at Elastic. You&#39;ll help shape how our global Revenue teams use data to make smart decisions, plan for growth, and stay focused on what matters.</p>
<p>This role is a mix of strategy, hands-on analysis, and cross-team collaboration. You&#39;ll work closely with Sales, Customer Success, Marketing, Finance, and more, bringing data to life and helping teams see the story behind the numbers.</p>
<p>We work across a wide range of tools and datasets, from dashboards and forecasts to detailed analytical deep dives, helping the business stay focused, aligned, and data-informed.</p>
<p>To support our growth and enable us to scale efficiently, we are seeking an exceptional Senior Data Analyst to drive sales strategy, planning, reporting, and analysis efforts.</p>
<p>In this position, you will play a strategic role in driving data-informed decision-making across Elastic’s Global Revenue Operations organization and broader go-to-market ecosystem.</p>
<p>You will work on high-impact analysis and develop scalable, leadership-level reporting to support sales effectiveness, pipeline optimization, and revenue growth.</p>
<p>You’ll use your strong analytical skills to break down complex business problems and help teams make smarter decisions.</p>
<p>Your insights will shape how we plan, operate, and improve over time.</p>
<p><strong>What You’ll Be Doing</strong></p>
<ul>
<li>Build clean, scalable dashboards and tools using SQL (BigQuery), dbt, and Tableau</li>
<li>Analyze complex data to answer key business questions, and turn insights into action</li>
<li>Handle ad hoc asks in Google Sheets, while staying focused on big-picture, long-term impact</li>
<li>Support senior stakeholders with clear, accurate reporting for exec and board-level needs</li>
<li>Question assumptions and get to the root of the problem, not just the request</li>
<li>Validate your work thoroughly and explore data anomalies with curiosity</li>
</ul>
<p><strong>Working Independently, While Staying Connected</strong></p>
<ul>
<li>Take ownership of projects from start to finish, managing your own scope, priorities, and timelines</li>
<li>Collaborate across time zones and teams (Sales, Field Ops, Data Engineering, and more) to ensure alignment and data consistency across data sources and reporting</li>
<li>Spot data issues early and partner with the right folks to fix them at the source</li>
<li>Help keep our reporting consistent and aligned across tools and teams</li>
</ul>
<p><strong>Learning, Growing, and Making an Impact</strong></p>
<ul>
<li>Build real-world experience in Revenue Operations while learning how the business runs</li>
<li>Lead high-impact projects that shape go-to-market strategy</li>
<li>Grow your skills in areas like predictive analytics, data architecture, and business planning</li>
<li>Work directly with senior stakeholders and build strong relationships across the company</li>
<li>Be part of a team where your ideas and work make a visible difference</li>
</ul>
<p><strong>What You Bring</strong></p>
<ul>
<li>4+ years of experience in data analytics, BI, or a similar role, ideally in a high-impact, fast-paced environment</li>
<li>Strong SQL skills (BigQuery preferred); experience with dbt is a plus</li>
<li>Proficient with data visualization tools like Tableau or Power BI. Experience with predictive analytics is a plus.</li>
<li>Experience working with Salesforce or similar sales data tools</li>
<li>Comfortable working in Google Sheets to support quick-turnaround requests</li>
<li>Familiarity with B2B SaaS and a solid understanding of sales or post-sales data</li>
<li>Experienced in managing complex projects with clarity and focus: you know how to prioritize, follow through, and get unblocked when needed</li>
<li>Clear, proactive communicator who can explain complex ideas simply and help others make informed decisions</li>
</ul>
<p>You’ll join a remote-friendly team that values curiosity, clarity, and action to deliver impact to the business. You’ll have room to grow, freedom to explore, and the support you need to do your best work, while learning how data helps shape every part of our business.</p>
<p><strong>Additional Information</strong></p>
<p><strong>We Take Care of Our People</strong></p>
<p>As a distributed company, diversity drives our identity. Whether you’re looking to launch a new career or grow an existing one, Elastic is the type of company where you can balance great work with great life.</p>
<p>Your age is only a number. It doesn’t matter if you’re just out of college or your children are; we need you for what you can do.</p>
<p>We strive to have parity of benefits across regions and while regulations differ from place to place, we believe taking care of our people is the right thing to do.</p>
<ul>
<li>Competitive pay based on the work you do here and not your previous salary</li>
<li>Health coverage for you and your family in many locations</li>
<li>Ability to craft your calendar with flexible locations and schedules for many roles</li>
<li>Generous number of vacation days each year</li>
</ul>
<p><strong>Increase your impact</strong></p>
<ul>
<li>We match up to $2,000 (or local currency equivalent) for financial donations and service</li>
<li>Up to 40 hours each year to use toward volunteer projects you love</li>
<li>Embracing parenthood with a minimum of 16 weeks of parental leave</li>
</ul>
<p>Different people approach problems differently. We need that.</p>
<p>Elastic is an equal opportunity employer and is committed to creating an inclusive culture that celebrates different perspectives, experiences, and backgrounds.</p>
<p>Qualified applicants will receive consideration for employment without regard to race, ethnicity, color, religion, sex, pregnancy, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, disability status, or any other basis protected by federal, state or local law, ordinance or regulation.</p>
<p>We welcome individuals with disabilities and strive to create an accessible and inclusive experience for all individuals.</p>
<p>To request an accommodation during the application or the recruiting process, please email candidate_accessibility@elastic.co.</p>
<p>We will reply to your request within 24 business hours of submission.</p>
<p>Applicants have rights under Federal Employment Laws, view posters linked below:</p>
<p>Family and Medical Leave Act (FMLA) Poster;</p>
<p>Pay Transparency Nondiscrimination Provision Poster;</p>
<p>Employee Polygraph Protection Act (EPPA) Poster and Know Your Rights (Poster)</p>
<p>Elastic develops and distributes technology and information that is subject to U.S. and other countries’ export controls and licensing requirements for individuals who are located in or are nationals of the following sanctioned countries and regions: Belarus, Cuba, Iran, North Korea, Syria, or Russia, including the Ukrainian territories annexed by Russia (The Crimea region of Ukraine, The Donetsk People&#39;s Republic (DNR), The Luhansk People&#39;s Republic (LNR), Kherson or Zaporizhzhia).</p>
<p>If you are located in or are a national of one of the listed countries or regions, an export license may be required as a condition of your employment in this role.</p>
<p>Please note that national origin and/or nationality do not affect eligibility for employment with Elastic.</p>
<p>Please see here for our Privacy Statement.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, BigQuery, dbt, Tableau, data visualization, predictive analytics, data architecture, business planning, Salesforce, Google Sheets</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Elastic</Employername>
      <Employerlogo>https://logos.yubhub.co/elastic.co.png</Employerlogo>
      <Employerdescription>Elastic is a software company that develops and distributes technology for search, security, and observability.</Employerdescription>
      <Employerwebsite>https://www.elastic.co/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/elastic/jobs/7601880</Applyto>
      <Location>Barcelona, Spain</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>e2f537b7-0f0</externalid>
      <Title>Delivery Solutions Architect</Title>
      <Description><![CDATA[<p>At Databricks, we are on a mission to empower our customers to solve the world&#39;s toughest data problems with the Databricks Data Intelligence Platform.</p>
<p>As a Delivery Solutions Architect (DSA), you are a trusted technical advisor to key customers, providing expert guidance that translates data, analytics, and AI challenges into high-impact business value.</p>
<p>You help design, implement, and scale data and AI solutions, focusing on architecture, operational excellence, and customer enablement.</p>
<p>Internally, you will collaborate with our sales and field engineering teams to accelerate the adoption and growth of the Databricks Platform in your customers.</p>
<p>DSAs focus on:</p>
<ul>
<li>Designing secure, scalable architecture</li>
<li>Aligning people, processes, and technology</li>
<li>Establishing trusted advisor relationships</li>
<li>Leveraging the broader ecosystem of Databricks experts</li>
</ul>
<p>This is a hybrid technical and commercial role.</p>
<p>Technically, the expectations are that you become the post-sales technical lead and trusted advisor across all Databricks products for the customer&#39;s top priority use cases.</p>
<p>This requires you to use your technical skills and credibility to engage and communicate with technical/technical leadership stakeholders in our customer organizations, do architecture reviews, help with performance and cost optimizations, demonstrate new capabilities, remove blockers, etc.</p>
<p>In parallel, it is commercial in the sense that you will drive growth in your assigned customers and use cases through leading your customers&#39; stakeholders, building executive relationships, orchestrating other focused/specialized teams within Databricks, and creating and driving onboarding plans.</p>
<p>While not a hands-on-keyboard role, this is a highly technical position where architectural skills in fields such as Data Architecture, Data Engineering, Data Warehousing, or Data Science are essential.</p>
<p>You will report directly to a DSA Manager within the Field Engineering organization.</p>
<p>The impact you will have:</p>
<ul>
<li>Be the Databricks Architect working with customer technical teams on use cases/data products, from development to go-live, addressing any technical challenges and blockers and providing guidance, best practices, and enablement</li>
<li>Lead the post-technical-win technical account strategy and execution plan for the majority of Databricks use cases within our most strategic accounts</li>
<li>Be the internal point of contact for any questions related to production/go-live status of agreed-upon use cases within an account, often for multiple use cases within the largest and most complex organizations</li>
<li>Leverage Shared Services, User Education, Onboarding/Technical Services, and Support resources, escalating to expert-level technical teams for tasks beyond your scope of activities or expertise</li>
<li>Create and execute a point of view as to how key use cases can be accelerated into production, coordinating with Professional Services (PS) resources on the delivery of PS Engagement proposals</li>
<li>Navigate Databricks Product and Engineering teams for new product innovations, private previews, and upgrade needs, presenting them to customers when applicable for their ongoing developments</li>
<li>Develop an execution plan that covers all activities of all customer-facing technical roles and teams across the following work streams:
<ul>
<li>Main use cases moving from &#39;win&#39; to production</li>
<li>Enablement/user growth plan</li>
<li>Product adoption (strategy and activities to increase adoption of Databricks&#39; Lakehouse vision)</li>
<li>Organic needs for current investment (e.g., cloud cost control, tuning &amp; optimization)</li>
<li>Executive and operational governance</li>
</ul>
</li>
<li>Provide internal and external updates, including KPI reporting on the status of usage and customer health (investment status, important risks and blockers, product adoption, and use case progression), to your Technical GM and Field Engineering leadership</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6-10 years of experience where you have been accountable for delivery of projects in Data, Analytics, or AI and where you can contribute to technical debate and design choices with customers</li>
<li>Programming experience in PySpark, SQL, or Scala</li>
<li>Understanding and hands-on experience of solution architecture for distributed data and analytics systems</li>
<li>Experience in a customer-facing pre-sales, technical architecture, customer success, or consulting role</li>
<li>Understanding of how to attribute business value and outcomes to specific project deliverables</li>
<li>Technical program coordination, including account and stakeholder management</li>
<li>Experience resolving complex and important escalations with senior customer technical stakeholders</li>
<li>Track record of overachievement against quota, goals, or similar objective targets</li>
<li>Bachelor&#39;s degree in Computer Science, Information Systems, Engineering, or equivalent work experience</li>
<li>Can travel up to 30%</li>
</ul>
<p>About Databricks</p>
<p>Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics, and AI.</p>
<p>Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the lakehouse, Apache Spark™, Delta Lake, and MLflow.</p>
<p>To learn more, follow Databricks on Twitter, LinkedIn, and Facebook.</p>
<p>Benefits</p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, click here.</p>
<p>Our Commitment to Diversity and Inclusion</p>
<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.</p>
<p>Compliance</p>
<p>If access to export-controlled technology or source code is required for performance of job duties, it is within Employer&#39;s discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>PySpark, SQL, Scala, Data Architecture, Data Engineering, Data Warehousing, Data Science</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a platform for unifying and democratizing data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8368003002</Applyto>
      <Location>Remote - Italy</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>85f1f87e-70f</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects that lead to customers&#39; successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of designing and deploying highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Experience building scalable streaming and batch solutions using cloud-native components</li>
<li>Ability to travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461327002</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6aab7ed8-23a</externalid>
      <Title>Senior Software Engineer - Data</Title>
      <Description><![CDATA[<p>We are seeking an experienced Senior Software Engineer (Data) to join our fast-paced, collaborative data team. In this role, you will have broad authority to drive the direction of our technographic data services, building world-class data pipelines and systems to process billions of signals and data points.</p>
<p>This is an exciting opportunity to solve challenging problems and make a big impact as we invest in making technographics a first-class offering.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Build and optimize big data pipelines to extract and process signals from the web, job postings, and other sources</li>
<li>Design and implement data architectures and storage solutions to efficiently handle massive data volumes</li>
<li>Collaborate closely with data scientists to support and integrate ML models into data workflows</li>
<li>Continuously improve data quality, performance, and scalability of our technographic data platform</li>
<li>Drive technical strategy and roadmap for the data processing infrastructure</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Extensive experience building and scaling big data pipelines and architectures from scratch</li>
<li>Deep expertise in big data frameworks (Hadoop, Spark) and the JVM stack (Java, Scala)</li>
<li>Strong software engineering fundamentals and ability to write efficient, high-quality code</li>
<li>Experience with entity recognition and NLP techniques a plus</li>
<li>Proven track record delivering results and driving projects in a fast-paced environment</li>
<li>Excellent collaboration and communication skills to work with data scientists, analysts and product teams</li>
<li>Passion for leveraging huge datasets to power valuable insights</li>
</ul>
<p>Ideal Background:</p>
<ul>
<li>8+ years of experience in software engineering roles</li>
<li>Experience working with very large datasets and distributed systems</li>
<li>Familiarity building data pipelines at large tech companies or data-driven organisations</li>
<li>Bachelor&#39;s or advanced degree in Computer Science, Engineering or related technical field</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$140,000-$220,000 USD</Salaryrange>
      <Skills>big data pipelines, data architectures, storage solutions, ML models, data quality, performance, scalability, data processing infrastructure, Hadoop, Spark, Java, Scala, entity recognition, NLP techniques</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>ZoomInfo</Employername>
      <Employerlogo>https://logos.yubhub.co/zoominfo.com.png</Employerlogo>
      <Employerdescription>ZoomInfo is a technology company that provides a go-to-market intelligence platform for businesses.</Employerdescription>
      <Employerwebsite>https://www.zoominfo.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/zoominfo/jobs/8486808002</Applyto>
      <Location>Bethesda, Maryland, United States; Waltham, Massachusetts, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>760c3e88-e35</externalid>
      <Title>Senior Product Manager, Data</Title>
      <Description><![CDATA[<p>Job Title: Senior Product Manager, Data</p>
<p>We are seeking a Senior Product Manager to support the development of CoreWeave&#39;s Enterprise Data Platform within the CIO organization. This role will contribute to building a scalable, high-performance data lake and data architecture, integrating data from key sources across Operations, Engineering, Sales, Finance, and other IT partners.</p>
<p>As a Senior Product Manager for Data Infrastructure and Analytics, you will help drive data ingestion, transformation, governance, and analytics enablement. You will collaborate with engineering, analytics, finance, and business teams to help deliver data lake and pipeline orchestration solutions, ensuring accessible data for business insights.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Own and evangelize Data Platform and Business Analytics roadmap and strategy across CoreWeave</li>
<li>Assist with the execution of CoreWeave&#39;s enterprise data architecture, helping enable the data lake and domain-driven data layer</li>
<li>Support the development and enhancement of data ingestion, transformation, and orchestration pipelines for scalability, efficiency, and reliability</li>
<li>Work with the Engineering and Data teams to maintain and enhance data pipelines for both structured and unstructured data, enabling efficient data movement across the organization</li>
<li>Collaborate with Finance, GTM, Infrastructure, Data Center, and Supply Chain teams to help unify and model data from core systems (ERP, CRM, Asset Mgmt, Supply Chain systems, etc.)</li>
<li>Contribute to data governance and quality initiatives, focusing on data consistency, lineage tracking, and compliance with security standards</li>
<li>Support the BI and analytics layer by partnering with stakeholders to enable data products, dashboards, and reporting capabilities</li>
<li>Help prioritize data-driven initiatives, ensuring alignment with business goals and operational needs in coordination with leadership</li>
</ul>
<p>Requirements:</p>
<ul>
<li>5+ years of experience in data product management, data architecture, or enterprise data engineering roles</li>
<li>Familiarity with data lakes, data warehouses, ETL/ELT and streaming pipelines, and data governance frameworks</li>
<li>Hands-on experience with modern data stack technologies (such as Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka)</li>
<li>Understanding of data modeling, domain-driven design, and creating scalable data platforms</li>
<li>Experience supporting the end-to-end data product lifecycle, including requirements gathering and implementation</li>
<li>Strong collaboration skills with engineering, analytics, and business teams to help deliver data initiatives</li>
<li>Awareness of data security, compliance, and governance best practices</li>
<li>Understanding of BI and analytics platforms (such as Tableau, Looker, Power BI) and supporting self-service analytics</li>
</ul>
<p>Why CoreWeave?</p>
<p>At CoreWeave, we work hard, have fun, and move fast! We&#39;re in an exciting stage of hyper-growth that you will not want to miss out on. We&#39;re not afraid of a little chaos, and we&#39;re constantly learning. Our team cares deeply about how we build our product and how we work together, which is represented through our core values:</p>
<ul>
<li>Be Curious at Your Core</li>
<li>Act Like an Owner</li>
<li>Empower Employees</li>
<li>Deliver Best-in-Class Client Experiences</li>
<li>Achieve More Together</li>
</ul>
<p>We support and encourage an entrepreneurial outlook and independent thinking. We foster an environment that encourages collaboration and provides the opportunity to develop innovative solutions to complex problems. As we get set for takeoff, the growth opportunities within the organization are constantly expanding. You will be surrounded by some of the best talent in the industry, who will want to learn from you, too. Come join us!</p>
<p>Salary Range: $143,000 to $210,000</p>
<p>Benefits:</p>
<ul>
<li>Medical, dental, and vision insurance - 100% paid for by CoreWeave</li>
<li>Company-paid Life Insurance</li>
<li>Voluntary supplemental life insurance</li>
<li>Short and long-term disability insurance</li>
<li>Flexible Spending Account</li>
<li>Health Savings Account</li>
<li>Tuition Reimbursement</li>
<li>Ability to Participate in Employee Stock Purchase Program (ESPP)</li>
<li>Mental Wellness Benefits through Spring Health</li>
<li>Family-Forming support provided by Carrot</li>
<li>Paid Parental Leave</li>
<li>Flexible, full-service childcare support with Kinside</li>
<li>401(k) with a generous employer match</li>
<li>Flexible PTO</li>
<li>Catered lunch each day in our office and data center locations</li>
<li>A casual work environment</li>
<li>A work culture focused on innovative disruption</li>
</ul>
<p>Workplace:</p>
<p>While we prioritize a hybrid work environment, remote work may be considered for candidates located more than 30 miles from an office, based on role requirements for specialized skill sets. New hires will be invited to attend onboarding at one of our hubs within their first month. Teams also gather quarterly to support collaboration.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$143,000 to $210,000</Salaryrange>
      <Skills>data product management, data architecture, enterprise data engineering, data lakes, data warehouses, ETL/ELT and streaming pipelines, data governance frameworks, modern data stack technologies, Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka, data modeling, domain-driven design, scalable data platforms, BI and analytics platforms, Tableau, Looker, Power BI</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>CoreWeave</Employername>
      <Employerlogo>https://logos.yubhub.co/coreweave.com.png</Employerlogo>
      <Employerdescription>CoreWeave is a cloud-based platform that enables innovators to build and scale AI with confidence.</Employerdescription>
      <Employerwebsite>https://www.coreweave.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/coreweave/jobs/4649824006</Applyto>
      <Location>Livingston, NJ / New York, NY / Sunnyvale, CA / Bellevue, WA/San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>26f523c0-bbd</externalid>
      <Title>Resident Solutions Architect - Manufacturing</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect (RSA) on our Professional Services team, you will work with customers on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>The impact you will have:</p>
<ul>
<li>Handle a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to customers&#39; successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Collaborate with the Databricks technical, Project Manager, Architect, and customer teams to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Willingness to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Ability to travel up to 30% when needed</li>
</ul>
<p>Pay Range Transparency: Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for an annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location is in, visit our page here.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494154002</Applyto>
      <Location>Boston, Massachusetts</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>a84988d7-61a</externalid>
      <Title>Partner Solutions Architect, CEMEA</Title>
      <Description><![CDATA[<p>As a Partner Solutions Architect, you will engage with our top Consulting and System Integrator (C&amp;SI) Partners and Field Engineering to drive adoption of the Data Intelligence Platform in our top customers through C-Suite Technical Executive alignment, engagement of Champions, and collaboration with our sales teams in the region.</p>
<p>You will develop ongoing partner capability via the &#39;Technical Champions program&#39; within our top C&amp;SI Partners to support CoE creation and delivery excellence.</p>
<p>You will provide strategic vision related to the Databricks Data Intelligence Platform aligning to the GSI engagement in our top accounts and develop and support programs to promote Partner expertise in the application of the Databricks Data Intelligence Platform.</p>
<p>You will report to the Director, Field Engineering (Partner Solutions Architect).</p>
<p>The impact you will have:</p>
<p>A Partner Solutions Architect plays a crucial role in the success of the Databricks partner ecosystem by ensuring partners have the technical knowledge and experience to build, maintain and grow successful solutions for their customers.</p>
<p>This, in turn, drives the adoption and success of the Databricks Data Intelligence Platform and the Partner solutions in the market.</p>
<p>To achieve this, the PSA will:</p>
<ul>
<li>Accelerate Partner pre-sales and delivery in joint, strategic customer accounts by aligning Partner and Databricks resources and providing technical expertise to accelerate adoption and consumption of the platform</li>
<li>Work closely with Databricks account teams to help our partner ecosystem scope, evaluate, and deliver large-scale data projects and transformational programmes</li>
<li>Grow the Partner Databricks delivery capability by providing technical expertise to help design, build, and maintain repeatable solutions using Databricks products and services</li>
<li>Develop, maintain, and grow senior technical executive relationships to identify new business opportunities, innovative use cases, and competition, and to support joint go-to-market initiatives</li>
</ul>
<p>The role requires up to 40% travel to GSI Partner sites and the Databricks German offices.</p>
<p>What we look for:</p>
<ul>
<li>Extensive hands-on experience as a data professional in a modern cloud-based data stack</li>
<li>At least 3 years of experience with technical pre-sales and sales methodologies within a consumption business model</li>
<li>Ability to collaborate closely with partner organisations at the senior executive level to understand their needs and objectives and align them to Databricks products and services</li>
<li>Ability to conduct training sessions, workshops, and webinars to educate partners on new technologies, features, and best practices</li>
<li>Ability to develop and maintain technical content such as whitepapers, case studies, and solution guides to assist partners in leveraging the Databricks offerings</li>
<li>Prior experience coding in a core programming language (e.g., Python, Java, Scala)</li>
<li>Experience designing, implementing, and maintaining end-to-end data architectures for big data, data warehousing, and AI on MPP-based platforms</li>
<li>Experience managing multiple, frequently changing priorities across multiple teams</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Intelligence Platform, Cloud-based data stack, Technical pre-sales and sales methodologies, Partner organisations, Senior executive level, New technologies, Features, Best practices, Core programming language, Python, Java, Scala, Data Architectures, Big Data, Data Warehousing, AI, MPP based platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8413308002</Applyto>
      <Location>Switzerland</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7e58a91c-29e</externalid>
      <Title>Strategic Alliance Operations Director</Title>
      <Description><![CDATA[<p>The Strategic Alliance Operations Director will lead the Accenture Databricks Business Group (ADBG) and build the operating model that powers one of Databricks&#39; most strategic global partnerships.</p>
<p>In this role, you will own the governance, portfolio management, and execution cadence for the ADBG, while also overseeing a focused strategic investment program that accelerates joint innovation with Accenture.</p>
<p>As the Strategic Alliance Operations Director, your primary mission is to design, launch, and scale the global operating framework that ensures Databricks and Accenture execute consistently, predictably, and with clear accountability across all joint initiatives.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Establishing, operationalizing, and programmatizing the Accenture BG, including governance structures, operating cadence, and delivery standards for all joint Databricks-Accenture initiatives.</li>
<li>Maintaining an end-to-end, integrated portfolio view covering pipeline, active programs, dependencies, financial performance, delivery health, and risks.</li>
<li>Leading executive and operational cadences (QBRs, steering committees, exec reviews) to drive decisions, resolve escalations, and ensure alignment across Databricks and Accenture stakeholders.</li>
<li>Developing and maintaining KPI dashboards and reporting that provide clear visibility into revenue, utilization, program status, delivery quality, and customer outcomes.</li>
<li>Defining and rolling out standardized tools, templates, and best practices for planning, tracking, governance, and reporting, enabling consistent, scalable execution across regions and industries.</li>
<li>Working cross-functionally (C&amp;SI, ISV Ecosystem, GTM Strategy &amp; Ops, Business Strategy &amp; Ops, Industry, Legal, Finance, Product) to define a clear strategic investment framework that dovetails with PMO governance, including pillars, guardrails, and decision criteria.</li>
<li>Ensuring that all investments are aligned with Databricks&#39; evolving strategic goals and the broader Accenture BG portfolio; structured with clear milestones, owners, and success metrics that plug into PMO tracking and reporting; operationalized through existing PMO processes; and fully auditable, with documented decisions and consolidated reporting via PMO dashboards.</li>
<li>Partnering with Accenture leadership and field teams to source, qualify, and shape investment opportunities, using standardized proposal formats that enable apples-to-apples evaluation by Databricks leadership.</li>
</ul>
<p>We look for an experienced Alliance Operations leader who is also a strategic builder: someone who anchors on operational excellence but is energized by shaping new, high-impact investment mechanisms within that structure.</p>
<p>Key qualifications include:</p>
<ul>
<li>Strong PMO and portfolio management leadership skills, with a track record of establishing governance and operating models in complex, multi-stakeholder environments.</li>
<li>Highly effective at building and leading cross-functional virtual teams across GTM, Finance, Product, Legal, and Partner organizations.</li>
<li>Exceptional ability to design and orchestrate processes, drive consensus across organizations, and resolve impasses while keeping execution on track.</li>
<li>Skilled negotiator with experience structuring and closing binding partner agreements that align strategic goals, risk, and return.</li>
<li>Strategic, analytical mindset with a strong bias for action; able to move quickly while maintaining rigor, transparency, and auditability.</li>
<li>15+ years of experience in program/portfolio management, partner operations, or strategic investments in hyper-growth or large-scale technology environments.</li>
<li>Demonstrated success standing up and leading PMO or portfolio functions for complex, global, multi-stakeholder initiatives.</li>
<li>Proven track record forming and leading cross-functional v-teams (GTM, Finance, Product, ISV/partner, Legal, Operations).</li>
<li>Experience negotiating and closing binding contracts with partners, including GSIs, RSIs, and ISVs.</li>
<li>Strong technical understanding of the Databricks product portfolio and modern cloud/data architectures.</li>
</ul>
<p>Pay Range Transparency: Databricks is committed to fair and equitable compensation practices. The pay range for this role is $143,700-$197,550 USD.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$143,700-$197,550 USD</Salaryrange>
      <Skills>Portfolio Management, Governance, Process Design, Cross-Functional Leadership, Negotiation, Strategic Planning, Analytical Thinking, Program Management, Partner Operations, Strategic Investments, Apache Spark, Delta Lake, MLflow, Cloud/Data Architectures</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8439170002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3d57b93e-423</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, providing training, and handling other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects, leading to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of design and deployment of highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have:</p>
<ul>
<li>Databricks Certification</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, data architecture, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456948002</Applyto>
      <Location>Atlanta, Georgia</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c7ba4251-36b</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>Job Title: Resident Solutions Architect - Public Sector</p>
<p>We are seeking a highly skilled Resident Solutions Architect to join our Professional Services team in Washington, D.C. As a Resident Solutions Architect, you will work with customers on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>
<p>Responsibilities:</p>
<ul>
<li>Handle a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Collaborate with the Databricks Technical, Project Manager, Architect and Customer teams to ensure the technical components of the engagement are delivered to meet customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues</li>
</ul>
<p>Requirements:</p>
<ul>
<li>US Top Secret Clearance is required for this position</li>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Bachelor&#39;s degree in Computer Science, Information Systems, Engineering, or equivalent experience gained through work</li>
<li>Ability to travel up to 30% when needed</li>
</ul>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and benefits.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p>About Databricks</p>
<p>Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.</p>
<p>Benefits</p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region click here.</p>
<p>Our Commitment to Diversity and Inclusion</p>
<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.</p>
<p>Compliance</p>
<p>If access to export-controlled technology or source code is required for performance of job duties, it is within Employer&#39;s discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, scope and timelines, documentation and white-boarding, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8356289002</Applyto>
      <Location>Washington, D.C.</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>cbd81d47-d7e</externalid>
      <Title>Data Platform Solutions Architect (Professional Services)</Title>
      <Description><![CDATA[<p>We&#39;re hiring for multiple roles within our Professional Services team. This position may be offered as Senior Solutions Consultant, Resident Solutions Architect, or Senior Resident Solutions Architect. The final title will align to your experience, technical depth, and customer-facing ownership.</p>
<p>As a Big Data Solutions Architect (Internal Title - Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, providing training, and handling other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Extensive experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 10% of the time</li>
</ul>
<p>Preferred but not essential: Databricks Certification</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8486738002</Applyto>
      <Location>London, United Kingdom</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>219928ef-6de</externalid>
      <Title>Resident Solutions Architect - Healthcare &amp; Life Sciences</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, providing training, and handling other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilizing the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and benefits.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494148002</Applyto>
      <Location>Philadelphia, Pennsylvania</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0c456364-565</externalid>
      <Title>Delivery Solutions Architect</Title>
      <Description><![CDATA[<p>As a Delivery Solutions Architect at Databricks, you will be a trusted technical advisor embedded within the customer organisation. You will work closely with sales and field engineering to accelerate adoption and growth of the Databricks platformقت You will ensure customer success by providing technical accountability for our most complex customers,helping them maximise the value of Databricks workloads they have already selected and improving their return on investment.</p>
<p>This role blends deep technical leadership with strategic customer engagement. You will own the post-sales technical strategy for the customer’s highest-value use cases and serve as their primary advisor across the Databricks platform.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Being the accountable Databricks Architect for your assigned customers, working with technical teams to guide priority use cases from design through go-live, removing blockers, providing best practices, and ensuring stable, scalable adoption.</li>
<li>Leading the post-technical-win strategy and execution plan for major Databricks use cases, aligning with Solutions Architects to understand full demand plans and drive clarity across multiple selling teams and stakeholders.</li>
<li>Owning the technical leadership of assigned use cases, creating certainty from ambiguity and coordinating onboarding, enablement, success, go-live, and healthy consumption of workloads selected for Databricks.</li>
<li>Serving as the first point of contact for production/go-live status, often across multiple complex use cases within large enterprise organisations.</li>
<li>Orchestrating the broader Databricks ecosystem (Shared Services, User Education, Onboarding/Technical Services, Support, and specialist technical teams) to ensure high-quality delivery and escalate advanced issues when needed.</li>
<li>Creating and executing a point of view for accelerating use cases into production, collaborating with Professional Services on proposals as needed.</li>
<li>Partnering with Product and Engineering to introduce new capabilities, private previews, and upgrade paths that support customer roadmaps.</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>Programming experience in Python, SQL, or Scala, and a solid understanding of distributed data systems.</li>
<li>5+ years of experience delivering Data, Analytics, or AI projects, with the ability to contribute to architectural discussions with customers.</li>
<li>Experience in customer-facing technical roles such as technical architecture, pre-sales, consulting, or customer success.</li>
<li>Ability to guide architectural decisions in domains such as data engineering, data architecture, data warehousing, or data science.</li>
<li>Demonstrated ability to drive delivery outcomes without hands-on keyboard responsibilities.</li>
<li>Experience resolving complex escalations with senior customer stakeholders.</li>
<li>Understanding of how to connect technical deliverables to business value.</li>
<li>Track record of achieving or exceeding goals or objectives.</li>
<li>Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent experience.</li>
<li>Fluency in English is required; French or German language skills are a plus.</li>
<li>Ability to travel up to 30%.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, SQL, Scala, Distributed data systems, Data engineering, Data architecture, Data warehousing, Data science</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a data intelligence platform to unify and democratize data, analytics, and AI. It has over 10,000 customers worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8309177002</Applyto>
      <Location>Zürich, Switzerland</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8efd6b3b-251</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, providing training, and handling other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects, leading to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456973002</Applyto>
      <Location>Boston, Massachusetts</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6d94d7ea-9ca</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
      <Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of designing and deploying highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Experience building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, design and deployment of highly performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461330002</Applyto>
      <Location>Washington, D.C.</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8664b981-66c</externalid>
      <Title>Data Platform Solutions Architect (Professional Services) - Emerging Enterprise &amp; DNB</Title>
      <Description><![CDATA[<p>We&#39;re hiring for multiple roles within our Professional Services team. Depending on experience and scope, this position may be offered as a Senior Solutions Consultant or a Resident Solutions Architect. You may know this role as a Big Data Solutions Architect, Analytics Architect, Data Platform Architect, or Technical Consultant. The final title will align to your experience, technical depth, and customer-facing ownership.</p>
<p>As a Data Platform Solutions Architect on our Professional Services team for the Emerging Enterprise &amp; Digital Natives business in EMEA, you will work with clients on short- to medium-term engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Drive high-impact customer projects: Design and build reference architectures, implement production use cases, and create “how-to” guides tailored to the unique needs of fast-moving Emerging Enterprise &amp; Digital Native customers in EMEA.</li>
<li>Collaborate on project scoping: Work closely with Engagement Managers and customers to define project scope, schedules, and deliverables for professional services engagements.</li>
<li>Enable transformational initiatives: Guide strategic customers through their end-to-end big data journeys, migrating from legacy platforms and deploying industry-leading data and AI applications on the Databricks platform.</li>
<li>Consult on architecture &amp; design: Provide thought leadership on solution design and implementation strategies, ensuring customers can successfully evaluate and adopt Databricks.</li>
<li>Offer advanced support: Serve as an escalation point for operational issues, collaborating with Databricks Support and Engineering to resolve challenges quickly.</li>
<li>Align technical delivery: Partner with cross-functional Databricks teams (Technical, PM, Architecture, and Customer Success) to align on milestones, ensuring customer needs and deadlines are met.</li>
<li>Amplify product feedback: Provide implementation insights to Databricks Product and Support teams, guiding rapid improvements in features and troubleshooting for customers.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Extensive experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 10% of the time</li>
<li>Preferred (not essential): Databricks Certification</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8439047002</Applyto>
      <Location>London, United Kingdom</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>2afc821d-248</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494149002</Applyto>
      <Location>Philadelphia, Pennsylvania</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>34a0bf55-11a</externalid>
      <Title>Resident Solutions Architect - Communications, Media, Entertainment &amp; Games</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionizing customer use cases</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461222002</Applyto>
      <Location>Boston, Massachusetts</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7d723067-22d</externalid>
      <Title>Resident Solutions Architect - Healthcare &amp; Life Sciences</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494144002</Applyto>
      <Location>Dallas, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7e5c6f46-bb6</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers to get most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456975002</Applyto>
      <Location>Dallas, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>b4a461d1-b6b</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a company that provides a data and AI platform. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494128002</Applyto>
      <Location>Washington, D.C.</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>32d8d11d-9dc</externalid>
      <Title>Resident Solutions Architect - Healthcare &amp; Life Sciences</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8371312002</Applyto>
      <Location>New York City, New York</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3e92e8a2-811</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494130002</Applyto>
      <Location>New York City, New York</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>9d5fcc78-b2b</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>You will work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>You will work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects.</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in, visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data science, cloud technology, Apache Spark, CI/CD, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Python, Scala, AWS, Azure, GCP, distributed computing, Spark runtime internals</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8423296002</Applyto>
      <Location>Central - United States; Northeast - United States; Southeast - United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3dac7f19-1df</externalid>
      <Title>Enterprise Account Executive - Bay Area</Title>
      <Description><![CDATA[<p>We&#39;re looking for a high-energy Enterprise Account Executive to drive net-new revenue and expansion within strategic Enterprise accounts. You&#39;ll be the owner of a defined territory where you&#39;ll build your own pipeline, tell the Elastic Search AI story, and close complex, multi-stakeholder deals in a consumption-based model.</p>
<p>This role sits at the intersection of sales execution, technical fluency, and cross-functional collaboration, and is critical to our growth in the Enterprise segment.</p>
<p>Key Responsibilities:</p>
<p>Own your territory &amp; build pipeline: Develop and execute a proactive outbound cadence (email, call, social) that generates ≥50% of your booked opportunities.</p>
<p>Deep discovery &amp; qualification: Uncover pain, business impact, budget, and decision criteria using frameworks like MEDDPICC so you chase only the highest-confidence deals.</p>
<p>Value storytelling &amp; demos: Craft and deliver tailored narratives and live demos that map Elastic’s Search, Observability, and Security capabilities to measurable business outcomes.</p>
<p>Mutual deal strategy &amp; forecast accuracy: Collaborate with customers to build formal close plans and keep your CRM up-to-date, maintaining ≥90% forecast accuracy within ±10%.</p>
<p>Executive negotiation &amp; closing: Lead high-stakes contract and pricing discussions; defend your value, structure give/get trades, and land multi-year consumption commitments.</p>
<p>Domain &amp; cloud acumen: Position Elastic as the Search AI platform of choice by speaking fluently about cloud economics, usage-based pricing, and modern data architectures.</p>
<p>Cross-functional partnership: Work hand-in-glove with Solutions Architects, Customer Success, Marketing, and RevOps to accelerate deals and drive exceptional customer outcomes.</p>
<p>Requirements:</p>
<p>Proven SaaS quota-carrying success: 5+ years closing complex Enterprise deals, consistently overachieving targets in a consumption-based or usage-model environment.</p>
<p>Expert discovery &amp; qualification skills: Demonstrated ability to apply MEDDPICC or equivalent frameworks to drive disciplined pipeline and eliminate low-probability deals.</p>
<p>Compelling value storytelling: Track record of delivering executive-level presentations and demos that tie product capabilities to real dollars saved, revenue gained, or risk mitigated.</p>
<p>Strong negotiation chops: History of landing multi-year, high-ACV contracts while protecting margin and securing executive stakeholder buy-in.</p>
<p>Technical &amp; cloud fluency: Comfortable discussing a broad range of technical topics including observability, security, vector/traditional search, and cloud cost optimization.</p>
<p>Collaborative mindset &amp; coachability: A learner who partners effectively with internal teams, incorporates feedback, and embodies Elastic’s values of community and openness.</p>
<p>Open Source enthusiasm: Genuine appreciation for open-source communities and the Elastic model; bonus if you’ve sold or advocated in an OSS context.</p>
<p>Bonus Points:</p>
<p>Prior experience at an open-source or developer-centric infrastructure company.</p>
<p>Familiarity with observability (logs, metrics, traces) or security analytics (SIEM/XDR) use cases.</p>
<p>If you’re driven to build your own pipeline, master complex deal cycles, and help customers unlock the power of Search AI, we’d love to talk. Apply today!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$113,300-$179,200 USD</Salaryrange>
      <Skills>SaaS, Enterprise sales, Cloud economics, Usage-based pricing, Modern data architectures, Observability, Security, Vector/traditional search, Cloud cost optimization, Open-source communities, Elastic model, OSS context</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Elastic</Employername>
      <Employerlogo>https://logos.yubhub.co/elastic.co.png</Employerlogo>
      <Employerdescription>Elastic is a Search AI company that enables everyone to find the answers they need in real time, using all their data, at scale.</Employerdescription>
      <Employerwebsite>https://www.elastic.co/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/elastic/jobs/7554566</Applyto>
      <Location>San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>4cd630c8-77d</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
      <Description><![CDATA[<p>Job Title: Resident Solutions Architect - Public Sector</p>
<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short to medium-term customer engagements on their big data challenges using the Databricks platform. You will provide data engineering, data science, and cloud technology projects which require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data.</p>
<p>Responsibilities:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope various professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects which lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet customer needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues</li>
</ul>
<p>Requirements:</p>
<ul>
<li>6+ years experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Pay Range Transparency:</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range for this role is $180,656-$248,360 USD.</p>
<p>About Databricks:</p>
<p>Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics, and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</p>
<p>Benefits:</p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, click here.</p>
<p>Our Commitment to Diversity and Inclusion:</p>
<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation, white-boarding</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform. The company was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494137002</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8d8b3af4-285</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
<Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to&#39;s, and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects and 3rd party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap or implement customer projects that lead to a customer&#39;s successful understanding, evaluation, and adoption of Databricks</li>
<li>Provide an escalated level of support for customer operational issues</li>
<li>Work with the Databricks technical team, Project Manager, Architect, and customer team to ensure the technical components of the engagement are delivered to meet the customer&#39;s needs</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues</li>
</ul>
<p>What we look for:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery, including managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>Nice to have: Databricks Certification</p>
<p>Pay Range Transparency</p>
<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected base salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>
<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>
<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>
<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>
<p>For more information regarding which range your location is in visit our page here.</p>
<p>Zone 1 Pay Range $180,656-$248,360 USD</p>
<p>Zone 2 Pay Range $180,656-$248,360 USD</p>
<p>Zone 3 Pay Range $180,656-$248,360 USD</p>
<p>Zone 4 Pay Range $180,656-$248,360 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems, Apache Spark, CI/CD, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>180656</Compensationmin>
      <Compensationmax>248360</Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8494147002</Applyto>
      <Location>Atlanta, Georgia</Location>
      <Country>United States</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>6fed2bb6-3b6</externalid>
      <Title>Resident Solutions Architect - Public Sector</Title>
<Description><![CDATA[<p>As a Resident Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>
<p>Your responsibilities will include:</p>
<ul>
<li>Designing and building reference architectures for customers</li>
<li>Creating how-to&#39;s and productionalizing customer use cases</li>
<li>Guiding strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consulting on architecture and design; bootstrapping or implementing customer projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks</li>
<li>Providing an escalated level of support for customer operational issues</li>
<li>Working with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs</li>
<li>Working with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues</li>
</ul>
<p>To be successful in this role, you will need:</p>
<ul>
<li>6+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Design and deployment of performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines</li>
<li>Documentation and white-boarding skills</li>
<li>Experience working with clients and managing conflicts</li>
<li>Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects</li>
<li>Travel to customers 20% of the time</li>
</ul>
<p>The pay range for this role is $180,656-$248,360 USD per year, depending on location and experience.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD per year</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>180656</Compensationmin>
      <Compensationmax>248360</Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461321002</Applyto>
      <Location>Chicago, Illinois</Location>
      <Country>United States</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>d0793a44-d91</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
<Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of designing and deploying highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
</ul>
<p>Nice to have:</p>
<ul>
<li>Databricks Certification</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>180656</Compensationmin>
      <Compensationmax>248360</Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8461328002</Applyto>
      <Location>Charlotte, North Carolina</Location>
      <Country>United States</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>5fd85b1e-563</externalid>
      <Title>Resident Solutions Architect - Financial Services</Title>
<Description><![CDATA[<p>As a Senior Big Data Solutions Architect (Sr Resident Solutions Architect) in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform.</p>
<p>You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks, helping customers get the most value out of their data.</p>
<p>RSAs are billable and know how to complete projects according to specification with excellent customer service.</p>
<p>You will report to the regional Manager/Lead.</p>
<p>The impact you will have:</p>
<ul>
<li>Work on a variety of impactful customer technical projects which may include designing and building reference architectures, creating how-to&#39;s and productionalizing customer use cases</li>
<li>Work with engagement managers to scope a variety of professional services work with input from the customer</li>
<li>Guide strategic customers as they implement transformational big data projects, 3rd party migrations, including end-to-end design, build and deployment of industry-leading big data and AI applications</li>
<li>Consult on architecture and design; bootstrap hands-on projects that lead to a customer&#39;s successful understanding, evaluation and adoption of Databricks.</li>
<li>Provide an escalated level of support for customer operational issues.</li>
<li>Work with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet customer&#39;s needs.</li>
<li>Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.</li>
</ul>
<p>What we look for:</p>
<ul>
<li>9+ years of experience in data engineering, data platforms &amp; analytics</li>
<li>Comfortable writing code in either Python or Scala</li>
<li>Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one</li>
<li>Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals</li>
<li>Familiarity with CI/CD for production deployments</li>
<li>Working knowledge of MLOps</li>
<li>Capable of designing and deploying highly performant end-to-end data architectures</li>
<li>Experience with technical project delivery - managing scope and timelines.</li>
<li>Documentation and white-boarding skills.</li>
<li>Experience working with clients and managing conflicts.</li>
<li>Experience in building scalable streaming and batch solutions using cloud-native components</li>
<li>Travel to customers up to 20% of the time</li>
<li>Nice to have: Databricks Certification</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,656-$248,360 USD</Salaryrange>
      <Skills>data engineering, data platforms &amp; analytics, Python, Scala, Cloud ecosystems (AWS, Azure, GCP), Apache Spark, CI/CD for production deployments, MLOps, design and deployment of highly performant end-to-end data architectures, technical project delivery, documentation and white-boarding skills, client management, Databricks Certification</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>180656</Compensationmin>
      <Compensationmax>248360</Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8456965002</Applyto>
      <Location>Dallas, Texas</Location>
      <Country>United States</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>35d691cd-c56</externalid>
      <Title>Partner Solutions Architect, CEMEA</Title>
      <Description><![CDATA[<p>As a Partner Solutions Architect, you will engage with our top Consulting and System Integrator (C&amp;SI) Partners and Field Engineering to drive adoption of the Data Intelligence Platform in our top customers through C-Suite Technical Executive alignment, engagement of Champions, and collaboration with our sales teams in the region.</p>
<p>You will develop ongoing partner capability via the &#39;Technical Champions program&#39; within our top C&amp;SI Partners to support CoE creation and delivery excellence.</p>
<p>You will provide strategic vision related to the Databricks Data Intelligence Platform aligning to the GSI engagement in our top accounts and develop and support programs to promote Partner expertise in the application of the Databricks Data Intelligence Platform.</p>
<p>You will report to the Director, Field Engineering (Partner Solutions Architect).</p>
<p>The impact you will have:</p>
<p>A Partner Solutions Architect plays a crucial role in the success of the Databricks partner ecosystem by ensuring partners have the technical knowledge and experience to build, maintain and grow successful solutions for their customers.</p>
<p>This, in turn, drives the adoption and success of the Databricks Data Intelligence Platform and the Partner solutions in the market.</p>
<p>To achieve this, the PSA will:</p>
<ul>
<li>Accelerate Partner pre-sales and delivery in joint, strategic customer accounts by aligning Partner and Databricks resources and providing technical expertise to accelerate adoption and consumption of the platform</li>
<li>Work closely with Databricks account teams to help our partner ecosystem scope, evaluate, and deliver large-scale data projects and transformational programmes</li>
<li>Grow the Partner Databricks delivery capability by providing technical expertise to help design, build, and maintain repeatable solutions using Databricks Products and Services</li>
<li>Develop, maintain, and grow senior technical executive relationships to identify new business opportunities, innovative use cases, and competition, and to support joint go-to-market initiatives</li>
<li>The role requires up to 40% travel to GSI Partner sites and the Databricks German offices</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Extensive hands-on experience as a Data Professional in a modern cloud-based data stack</li>
<li>At least 3 years of experience with technical pre-sales and sales methodologies within a consumption business model</li>
<li>Ability to collaborate closely with partner organisations at the senior executive level to understand their needs and objectives and align them to Databricks products and services</li>
<li>Ability to conduct training sessions, workshops, and webinars to educate partners on new technologies, features, and best practices</li>
<li>Ability to develop and maintain technical content such as whitepapers, case studies, and solution guides to assist partners in leveraging the Databricks offerings</li>
<li>Prior experience with coding in a core programming language (e.g., Python, Java, Scala)</li>
<li>Experience designing, implementing, and maintaining end-to-end data architectures for Big Data, Data Warehousing, and AI on MPP-based platforms</li>
<li>Experience managing multiple, frequently changing priorities across multiple teams</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Intelligence Platform, Cloud-based data stack, Technical pre-sales and sales methodologies, Partner organisations, Senior executive level, Coding in Python, Java, Scala, Data Architectures for Big Data, Data Warehousing and AI</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8407925002</Applyto>
      <Location>Berlin, Germany; Germany; Munich, Germany</Location>
      <Country>Germany</Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>9c8f210e-400</externalid>
      <Title>Senior AI Deployment Strategist</Title>
<Description><![CDATA[<p>About Mistral</p>
<p>At Mistral AI, we believe in the power of AI to simplify tasks, save time, and enhance learning and creativity. Our technology is designed to integrate seamlessly into daily working life. We democratize AI through high-performance, optimized, open-source and cutting-edge models, products and solutions. Our comprehensive AI platform is designed to meet enterprise needs, whether on-premises or in cloud environments. Our offerings include le Chat, the AI assistant for life and work.</p>
<p>Role Summary: Senior AI Deployment Strategist</p>
<p>As a Senior AI Deployment Strategist, you will bridge the gap between vision and execution, ensuring our customers not only see the potential of AI but realize its value in their operations. This role is a unique blend of strategic advisory and hands-on deployment leadership, spanning both presales and postsales phases. You will act as a trusted advisor to C-suite executives, architecting AI solutions that drive transformation and delivering measurable business outcomes. You will be embedded in our most strategic accounts, diagnosing business challenges, designing AI-powered solutions, and leading their deployment from concept to full-scale adoption. This role demands a rare combination of executive presence, technical credibility, and commercial acumen, with a focus on both winning and delivering high-impact AI initiatives.</p>
<p>What you will do</p>
<p>Strategic Advisory &amp; C-Suite Partnership</p>
<ul>
<li>Serve as the lead strategic advisor for a portfolio of high-value enterprise clients, building trusted relationships with executive leadership (CEO, CTO, CIO)</li>
<li>Lead C-suite workshops to diagnose business drivers and co-create multi-year AI transformation roadmaps aligned with corporate strategy</li>
<li>Develop and present compelling business cases and proposals, articulating the ROI of AI adoption and the value of Mistral’s platform</li>
</ul>
<p>Presales: Shaping the Vision</p>
<ul>
<li>Partner with sales and product teams to identify and qualify opportunities, positioning Mistral’s solutions as the catalyst for enterprise transformation</li>
<li>Design tailored AI strategies and proof-of-concepts that address critical business challenges and demonstrate tangible value</li>
<li>Act as a subject matter expert in client engagements, ensuring our solutions are understood, trusted, and adopted</li>
</ul>
<p>Postsales: Driving Deployment &amp; Adoption</p>
<ul>
<li>Own the end-to-end success of AI deployments, from strategic planning to operational integration, ensuring solutions are embedded into the customer’s core workflows</li>
<li>Lead cross-functional teams of engineers, data scientists, and product managers to execute deployment roadmaps and deliver measurable results</li>
<li>Navigate organizational complexity and drive change management to ensure seamless adoption and long-term success</li>
</ul>
<p>Commercial Growth &amp; Thought Leadership</p>
<ul>
<li>Identify and cultivate expansion opportunities within accounts, connecting Mistral’s capabilities to new business challenges and driving commercial growth</li>
<li>Mentor junior strategists and contribute to the development of best practices, methodologies, and playbooks for customer engagement</li>
<li>Represent Mistral AI as a thought leader through speaking engagements, executive briefings, and industry contributions</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>AI/ML concepts, Enterprise data architecture, Modern software development, Hands-on development of AI solutions, C-suite executive engagement, Complex conversation navigation, Business objective alignment, Ambiguous problem structuring, Actionable program development, High-stakes environment management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mistral AI</Employername>
      <Employerlogo>https://logos.yubhub.co/mistral.ai.png</Employerlogo>
      <Employerdescription>Mistral AI is an AI technology company that offers high-performance, optimized, open-source and cutting-edge models, products and solutions for enterprise needs.</Employerdescription>
      <Employerwebsite>https://mistral.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/mistral/0004f890-99d5-47c5-bb67-8f3f76a1e08f</Applyto>
      <Location>Paris</Location>
      <Country>France</Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>b635ac27-bac</externalid>
      <Title>Product Manager - Accounting</Title>
<Description><![CDATA[<p>We&#39;re hiring a Product Manager who deeply understands accounting to own the Accounting &amp; Finance Automation pillar within our Finance Operating System. This role will define how global finance teams manage corporate spend data, maintain financial accuracy, integrate with their existing systems, and handle compliance, eliminating hundreds of hours of manual work each month.</p>
<p>You will own the product experience for accountants, controllers, and finance teams who use Jeeves to manage corporate spend across 20+ countries. This requires deep knowledge of accounting principles (GAAP/IFRS), how finance organizations operate, multi-jurisdiction compliance requirements, and the financial workflows that matter most to our customers.</p>
<p>You will combine accounting expertise with product management craft, using your understanding of how finance teams work to identify high-impact problems, translate complex requirements into elegant product solutions, and leverage AI to transform manual processes into automated workflows that accountants trust.</p>
<p>Because our customers operate across LatAm, the US, and Europe, with diverse organizational structures, cross-border operations, and varying regulatory requirements, we strongly prefer candidates based in Mexico, Colombia, or Brazil who understand these regional nuances.</p>
<p>Location: This role is based out of Mexico City, Mexico. It is a full-time remote position, with the option to work from our office in Roma Norte on a flexible schedule.</p>
<p><strong>Accounting Domain Expertise:</strong></p>
<ul>
<li>Own the accounting user experience end-to-end: Design products that accountants, controllers, and finance teams use daily to manage spend data, maintain financial accuracy, and ensure compliance.</li>
<li>Deeply understand accounting workflows: How finance teams process transactions, manage data flows, ensure accuracy, and meet regulatory requirements.</li>
<li>Navigate multi-jurisdiction complexity: Design for customers operating across different countries, each with unique regulatory frameworks, compliance requirements, and business practices.</li>
<li>Champion data accuracy and audit-ability: Every feature you build must maintain proper audit trails, ensure data integrity, and support compliance requirements.</li>
</ul>
<p><strong>AI-Powered Automation:</strong></p>
<ul>
<li>Identify high-impact automation opportunities: Find where finance teams lose hours to repetitive, manual work, and determine where AI can transform workflows.</li>
<li>Balance AI capabilities with accounting precision: Understand when AI-powered automation is appropriate vs. when deterministic rules are required.</li>
<li>Prototype and validate AI solutions: Personally test AI-powered approaches to validate whether they actually work before committing engineering resources.</li>
<li>Ship AI features that accountants trust: Build transparency into AI decisions, enable easy overrides, and maintain human-in-the-loop workflows where precision matters.</li>
</ul>
<p><strong>System Integration &amp; Data Architecture:</strong></p>
<ul>
<li>Own the integration strategy: Define how Jeeves connects with ERP systems, accounting software, and other platforms that finance teams rely on.</li>
<li>Design for flexibility and configuration: Build systems that adapt to different customer setups, organizational structures, and business requirements.</li>
<li>Ensure data integrity end-to-end: Every transaction must reconcile.</li>
</ul>
<p><strong>Customer Discovery &amp; Requirements Gathering:</strong></p>
<ul>
<li>Be the expert on customer needs: Conduct regular interviews with controllers, accountants, finance managers, and CFOs to understand pain points in their current workflows.</li>
<li>Partner with customer finance teams: Shadow their processes, understand their requirements, and learn the nuances of how they work.</li>
<li>Convert pain into product vision: Transform qualitative feedback and workflow observations into clear product opportunities.</li>
</ul>
<p><strong>Data Analysis &amp; Insight Generation:</strong></p>
<ul>
<li>Track adoption and impact metrics: Measure how customers use the product, where they struggle, and what drives value.</li>
<li>Analyze workflow data: Identify patterns in customer behavior, common pain points, and opportunities for improvement.</li>
<li>Build business cases: Quantify the ROI of product investments, including hours saved for customers, error reduction, improved workflows, and business impact.</li>
</ul>
<p><strong>Strategy &amp; Vision:</strong></p>
<ul>
<li>Define the product roadmap: Build a clear point of view on where the market is heading, how customer needs are evolving, and how Jeeves can lead the way.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>accounting principles (GAAP/IFRS), financial workflows, multi-jurisdiction compliance requirements, AI-powered automation, data architecture, integration strategy, customer discovery, requirements gathering, data analysis, insight generation</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Jeeves</Employername>
      <Employerlogo>https://logos.yubhub.co/jeeves.com.png</Employerlogo>
      <Employerdescription>Jeeves is a financial operating system built for global businesses that provides corporate cards, cross-border payments, and spend management software within one unified platform. It operates across 20+ countries and serves over 5,000 clients.</Employerdescription>
      <Employerwebsite>https://www.jeeves.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/tryjeeves/83ca9734-ed23-47d0-8a79-2f3b9779a96c</Applyto>
      <Location>Mexico City</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>41bea01f-f31</externalid>
      <Title>Product Manager - Accounting</Title>
<Description><![CDATA[<p>We&#39;re hiring a Product Manager who deeply understands accounting to own the Accounting &amp; Finance Automation pillar within our Finance Operating System. This role will define how global finance teams manage corporate spend data, maintain financial accuracy, integrate with their existing systems, and handle compliance, eliminating hundreds of hours of manual work each month.</p>
<p>You will own the product experience for accountants, controllers, and finance teams who use Jeeves to manage corporate spend across 20+ countries. This requires deep knowledge of accounting principles (GAAP/IFRS), how finance organizations operate, multi-jurisdiction compliance requirements, and the financial workflows that matter most to our customers.</p>
<p>You will combine accounting expertise with product management craft, using your understanding of how finance teams work to identify high-impact problems, translate complex requirements into elegant product solutions, and leverage AI to transform manual processes into automated workflows that accountants trust.</p>
<p>Because our customers operate across LatAm, the US, and Europe, with diverse organizational structures, cross-border operations, and varying regulatory requirements, we strongly prefer candidates based in Mexico, Colombia, or Brazil who understand these regional nuances.</p>
<p>Location: This role is based in São Paulo, Brazil. It is a full-time remote position, with the option to work from our office at Complexo JK Iguatemi on a flexible schedule.</p>
<p><strong>Accounting Domain Expertise:</strong></p>
<ul>
<li>Own the accounting user experience end-to-end: Design products that accountants, controllers, and finance teams use daily to manage spend data, maintain financial accuracy, and ensure compliance.</li>
<li>Deeply understand accounting workflows: How finance teams process transactions, manage data flows, ensure accuracy, and meet regulatory requirements.</li>
<li>Navigate multi-jurisdiction complexity: Design for customers operating across different countries, each with unique regulatory frameworks, compliance requirements, and business practices.</li>
<li>Champion data accuracy and auditability: Every feature you build must maintain proper audit trails, ensure data integrity, and support compliance requirements.</li>
</ul>
<p><strong>AI-Powered Automation:</strong></p>
<ul>
<li>Identify high-impact automation opportunities: Find where finance teams lose hours to repetitive, manual work, and determine where AI can transform workflows.</li>
<li>Balance AI capabilities with accounting precision: Understand when AI-powered automation is appropriate vs. when deterministic rules are required.</li>
<li>Prototype and validate AI solutions: Personally test AI-powered approaches to validate whether they actually work before committing engineering resources.</li>
<li>Ship AI features that accountants trust: Build transparency into AI decisions, enable easy overrides, and maintain human-in-the-loop workflows where precision matters.</li>
</ul>
<p><strong>System Integration &amp; Data Architecture:</strong></p>
<ul>
<li>Own the integration strategy: Define how Jeeves connects with ERP systems, accounting software, and other platforms that finance teams rely on.</li>
<li>Design for flexibility and configuration: Build systems that adapt to different customer setups, organizational structures, and business requirements.</li>
<li>Ensure data integrity end-to-end: Every transaction must reconcile.</li>
</ul>
<p><strong>Customer Discovery &amp; Requirements Gathering:</strong></p>
<ul>
<li>Be the expert on customer needs: Conduct regular interviews with controllers, accountants, finance managers, and CFOs to understand pain points in their current workflows.</li>
<li>Partner with customer finance teams: Shadow their processes, understand their requirements, and learn the nuances of how they work.</li>
<li>Convert pain into product vision: Transform qualitative feedback and workflow observations into clear product opportunities.</li>
</ul>
<p><strong>Data Analysis &amp; Insight Generation:</strong></p>
<ul>
<li>Track adoption and impact metrics: Measure how customers use the product, where they struggle, and what drives value.</li>
<li>Analyze workflow data: Identify patterns in customer behavior, common pain points, and opportunities for improvement.</li>
<li>Build business cases: Quantify the ROI of product investments, including hours saved for customers, error reduction, improved workflows, and business impact.</li>
</ul>
<p><strong>Strategy &amp; Vision:</strong></p>
<ul>
<li>Define the product roadmap: Build a clear point of view on where the market is heading, how customer needs are evolving, and how Jeeves can lead the way.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>accounting, product management, financial analysis, data analysis, AI-powered automation, system integration, data architecture, customer discovery, requirements gathering, insight generation, business case development, machine learning, natural language processing, cloud computing, containerization, DevOps, agile development, scrum, kanban</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Jeeves</Employername>
      <Employerlogo>https://logos.yubhub.co/jeeves.com.png</Employerlogo>
      <Employerdescription>Jeeves is a financial operating system that provides corporate cards, cross-border payments, and spend management software within one unified platform. It operates across 20+ countries and serves over 5,000 clients.</Employerdescription>
      <Employerwebsite>https://www.jeeves.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/tryjeeves/3403bcbd-87c1-4790-99d3-5635eb8670e1</Applyto>
      <Location>São Paulo</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>651a6835-81f</externalid>
      <Title>Member of Revenue Strategy &amp; Operations, Marketing Analytics</Title>
      <Description><![CDATA[<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto. We are looking for a high-impact Marketing Analytics &amp; Operations Manager to architect and scale our marketing data ecosystem.</p>
<p>This role goes beyond reporting: you will be responsible for building the foundation that powers our go-to-market strategy, from attribution modeling to executive decision-making. You will partner closely with marketing, sales, and revenue leadership to connect marketing efforts to business outcomes, enabling smarter investments and accelerating growth in the institutional crypto ecosystem.</p>
<p><strong>Core Competencies</strong></p>
<ul>
<li>Advanced SQL proficiency: Ability to extract, transform, and analyze large datasets across multiple systems to generate insights</li>
<li>Salesforce expertise (2–3+ years): Hands-on experience as a Salesforce admin supporting marketing and GTM teams, including campaign tracking, attribution, and data architecture</li>
<li>Marketing data architecture ownership: Experience integrating and scaling data across tools (e.g., CRM, marketing automation, analytics platforms)</li>
<li>Analytical rigor + creative problem-solving: Strong critical thinker who can challenge assumptions, identify root causes, and design innovative solutions</li>
<li>Cross-functional leadership: Proven ability to collaborate across marketing, sales, RevOps, and product teams to drive aligned outcomes</li>
</ul>
<p><strong>Technical Skills:</strong></p>
<p>Technical &amp; Analytical Ownership</p>
<ul>
<li>Own and evolve marketing data infrastructure, including Salesforce and integrated MarTech systems</li>
<li>Write and optimize SQL queries to analyze campaign performance, pipeline generation, and revenue attribution</li>
<li>Build and maintain scalable dashboards and reporting frameworks across:
<ul>
<li>Performance marketing</li>
<li>BDR outbound</li>
<li>Field &amp; event marketing</li>
</ul>
</li>
<li>Develop multi-touch attribution models and conversion tracking methodologies</li>
<li>Analyze ROAS, LTV/CAC, and funnel efficiency to inform investment decisions</li>
</ul>
<p><strong>Complexity and Impact of Work:</strong></p>
<p>Systems &amp; Data Architecture</p>
<ul>
<li>Serve as Salesforce admin for marketing, ensuring clean data structures, campaign tracking, and attribution integrity</li>
<li>Assess and optimize the marketing tech stack, identifying gaps and opportunities for automation and scale</li>
<li>Partner closely with Data teams to ensure data consistency and governance across systems</li>
</ul>
<p><strong>Organizational Knowledge:</strong></p>
<ul>
<li>Produce executive-ready dashboards, reports, and presentations that clearly communicate marketing performance and ROI</li>
<li>Provide actionable insights to support budget allocation, channel strategy, and campaign prioritization</li>
<li>Conduct deep-dive analyses into client journeys, segmentation, and growth drivers</li>
</ul>
<p><strong>Communication and Influence:</strong></p>
<ul>
<li>Act as a key partner to Marketing, Sales, BDR, and Relationship Management teams</li>
<li>Translate business needs into data requirements and technical solutions</li>
<li>Influence stakeholders through clear communication, strong storytelling, and data-backed recommendations</li>
</ul>
<p><strong>What Sets You Apart:</strong></p>
<ul>
<li>Deep understanding of the institutional crypto landscape and client lifecycle</li>
<li>Experience evaluating and implementing MarTech tools and integrations</li>
<li>Ability to move fluidly between technical execution and strategic thinking</li>
<li>Strong storytelling skills: turning data into clear, compelling narratives for executives</li>
</ul>
<p><strong>You may be a fit for this role if you have:</strong></p>
<ul>
<li>2–3+ years of Salesforce experience supporting marketing or GTM teams (admin-level ownership preferred)</li>
<li>Strong hands-on experience with SQL and data analysis (required)</li>
<li>Experience building marketing dashboards, attribution models, and performance reporting systems</li>
<li>Familiarity with institutional crypto, fintech, or financial services and how GTM operates in these environments</li>
<li>A track record of building or improving data infrastructure and analytics frameworks</li>
<li>A scrappy, ownership-driven mindset: comfortable wearing multiple hats and operating in a fast-paced environment</li>
</ul>
<p><strong>Although not a requirement, bonus points if:</strong></p>
<ul>
<li>You have experience with data warehousing solutions (e.g., Snowflake, BigQuery, Redshift)</li>
<li>You have exposure to BI tools (e.g., Looker, Tableau, Hex)</li>
<li>You were emotionally moved by the soundtrack to Hamilton, which chronicles the founding of a new financial system. :)</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Advanced SQL proficiency, Salesforce expertise, Marketing data architecture ownership, Analytical rigor + creative problem-solving, Cross-functional leadership, Data warehousing solutions, BI tools, Institutional crypto landscape and client lifecycle, MarTech tools and integrations, Strong storytelling skills</Skills>
      <Category>Marketing</Category>
      <Industry>Finance</Industry>
      <Employername>Anchorage Digital</Employername>
      <Employerlogo>https://logos.yubhub.co/anchorage.com.png</Employerlogo>
      <Employerdescription>Anchorage Digital is a crypto platform that enables institutions to participate in digital assets through custody, staking, trading, governance, settlement, and the industry&apos;s leading security infrastructure. It has a Series D valuation over $3 billion.</Employerdescription>
      <Employerwebsite>https://anchorage.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/anchorage/0a8757c6-e3e9-42d5-aa15-9b779f2e8c16</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>e5f7b923-e21</externalid>
      <Title>Data Science Manager</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Data Science Manager to join our team at Stripe. As a Data Science Manager, you will be responsible for the success of your team, driving the roadmap and priorities, collaborating with stakeholders, and managing a high-performing team of data scientists.</p>
<p>The MaaS Data Science team is central to all money movements, embedded finance, and platform solutions for our biggest and most complex customers. The two roles available are in Embedded Finance (Capital, Issuing) and Connect (Stripe&#39;s solution and growth suite for platforms and marketplaces).</p>
<p>The Growth Data Science team helps businesses on Stripe get started both quickly and effectively. We work closely with Growth product and engineering leads to optimize every step of the user journey, from awareness and acquisition, through product adoption, to usage growth and retention.</p>
<p>Responsibilities:</p>
<ul>
<li>Drive the roadmap and priorities for your team, and work with many Stripe leaders across the company to enhance our ability to be data driven.</li>
<li>Collaborate with stakeholders across the organization such as engineering, analytics, operations, finance, and marketing.</li>
<li>Lead and manage processes to help the team do its best work and engage effectively with the rest of Stripe.</li>
<li>Manage a high-performing team of data scientists, supporting them to achieve a high level of technical excellence and advance in their careers.</li>
<li>Recruit and onboard great data scientists, in collaboration with Stripe’s recruiting team.</li>
<li>Contribute to broad data science initiatives as a member of Stripe’s data science management team.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>PhD, MS, or BS in a quantitative field (e.g., Statistics, Operations Research, Economics, Computer Science, Engineering).</li>
<li>At least 3 years of direct management experience leading data science or ML teams, and 10 years of overall data science experience.</li>
<li>Demonstrated expertise in designing metrics and guiding business decisions with data.</li>
<li>Technical expertise to drive clarity with staff and senior scientists about architecture and strategic modeling decisions.</li>
<li>Managed teams that have built and shipped machine learning systems and data products at scale, and have hands-on experience with challenging problems.</li>
<li>Work very well cross-functionally, and are able to think rigorously and make hard decisions and tradeoffs.</li>
<li>Clear and persuasive communication skills in writing and verbally.</li>
<li>Thrive on a high level of autonomy and responsibility.</li>
<li>Foster a healthy, inclusive, challenging, and supportive work environment.</li>
</ul>
<p>Preferred Requirements:</p>
<ul>
<li>Comfortable working with geographically distributed teams.</li>
<li>Expertise in time series forecasting, predictive modeling, or optimization.</li>
<li>Expertise in data design and building scalable data architectures.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>PhD or MS or BS in a quantitative field, At least 3 years of direct management experience leading data science or ML teams, 10 years of overall data science experience, Demonstrated expertise in designing metrics and guiding business decisions with data, Technical expertise to drive clarity with staff and senior scientists about architecture and strategic modeling decisions, Time series forecasting, Predictive modeling, Optimization, Data design, Scalable data architectures</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Stripe</Employername>
      <Employerlogo>https://logos.yubhub.co/stripe.com.png</Employerlogo>
      <Employerdescription>Stripe is a financial infrastructure platform for businesses, used by millions of companies worldwide.</Employerdescription>
      <Employerwebsite>https://stripe.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/stripe/jobs/7644403</Applyto>
      <Location>Seattle, WA OR New York, NY OR Remote North America</Location>
      <Country></Country>
      <Postedate>2026-03-31</Postedate>
    </job>
    <job>
      <externalid>51fb35f8-ae2</externalid>
      <Title>Data Engineer</Title>
      <Description><![CDATA[<p>We are seeking an experienced Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining large-scale data systems and pipelines. You will work closely with cross-functional teams to ensure seamless integration with existing systems and to drive business growth through data-driven insights.</p>
<p>Responsibilities:</p>
<ul>
<li>Design and develop scalable data architectures using cloud-based technologies such as AWS and Azure</li>
<li>Develop and maintain ETL processes to extract, transform, and load data from various sources</li>
<li>Collaborate with data scientists to develop and deploy machine learning models</li>
<li>Ensure data quality, security, and compliance with regulatory requirements</li>
<li>Work with stakeholders to identify business needs and develop data solutions</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Bachelor&#39;s degree in Computer Science, Engineering, or related field</li>
<li>3+ years of experience in data engineering or a related field</li>
<li>Strong understanding of data architecture, design patterns, and best practices</li>
<li>Experience with cloud-based technologies such as AWS and Azure</li>
<li>Proficiency in programming languages such as Python, Java, or C++</li>
<li>Excellent problem-solving skills and attention to detail</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Master&#39;s degree in Computer Science, Engineering, or related field</li>
<li>Experience with big data technologies such as Hadoop, Spark, or NoSQL databases</li>
<li>Familiarity with data visualization tools such as Tableau, Power BI, or D3.js</li>
<li>Certification in data engineering or a related field</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunity to work with a leading technology business</li>
<li>Collaborative and dynamic work environment</li>
<li>Professional development opportunities</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>AWS, Azure, Python, Java, C++, ETL, data architecture, data design patterns, data quality, data security, regulatory compliance, Hadoop, Spark, NoSQL databases, Tableau, Power BI, D3.js</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Williams Advanced Engineering</Employername>
      <Employerlogo>https://logos.yubhub.co/williamsadvancedengineering.com.png</Employerlogo>
      <Employerdescription>Williams Advanced Engineering is a technology business that operates at the intersection of motorsport and industry. It generates revenue through engineering services.</Employerdescription>
      <Employerwebsite>https://www.williamsadvancedengineering.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.williamsf1.com/job/trackside-operations-lead-hospitality-in-london-jid-494</Applyto>
      <Location>Grove</Location>
      <Country></Country>
      <Postedate>2026-03-12</Postedate>
    </job>
    <job>
      <externalid>f763de0a-b86</externalid>
      <Title>Senior AI Deployment Strategist</Title>
      <Description><![CDATA[<p>As a Senior AI Deployment Strategist, you will bridge the gap between vision and execution, ensuring our customers not only see the potential of AI but realize its value in their operations.</p>
<p>This role is a unique blend of strategic advisory and hands-on deployment leadership, spanning both presales and postsales phases. You will act as a trusted advisor to C-suite executives, architecting AI solutions that drive transformation and delivering measurable business outcomes.</p>
<p>Responsibilities:</p>
<ul>
<li>Serve as the lead strategic advisor for a portfolio of high-value enterprise clients, building trusted relationships with executive leadership (CEO, CTO, CIO).</li>
<li>Lead C-suite workshops to diagnose business drivers and co-create multi-year AI transformation roadmaps aligned with corporate strategy.</li>
<li>Develop and present compelling business cases and proposals, articulating the ROI of AI adoption and the value of Mistral&#39;s platform.</li>
</ul>
<p>Presales: Shaping the Vision</p>
<ul>
<li>Partner with sales and product teams to identify and qualify opportunities, positioning Mistral&#39;s solutions as the catalyst for enterprise transformation.</li>
<li>Design tailored AI strategies and proof-of-concepts that address critical business challenges and demonstrate tangible value.</li>
<li>Act as a subject matter expert in client engagements, ensuring our solutions are understood, trusted, and adopted.</li>
</ul>
<p>Postsales: Driving Deployment &amp; Adoption</p>
<ul>
<li>Own the end-to-end success of AI deployments, from strategic planning to operational integration, ensuring solutions are embedded into the customer&#39;s core workflows.</li>
<li>Lead cross-functional teams of engineers, data scientists, and product managers to execute deployment roadmaps and deliver measurable results.</li>
<li>Navigate organizational complexity and drive change management to ensure seamless adoption and long-term success.</li>
</ul>
<p>Commercial Growth &amp; Thought Leadership</p>
<ul>
<li>Identify and cultivate expansion opportunities within accounts, connecting Mistral&#39;s capabilities to new business challenges and driving commercial growth.</li>
<li>Mentor junior strategists and contribute to the development of best practices, methodologies, and playbooks for customer engagement.</li>
<li>Represent Mistral AI as a thought leader through speaking engagements, executive briefings, and industry contributions.</li>
</ul>
<p>About you</p>
<ul>
<li>You have 8+ years of experience in technical management consulting, enterprise technology, or a customer-facing strategic role, with a focus on AI/ML, data architecture, or digital transformation.</li>
<li>You hold a degree in a relevant scientific field (e.g., Computer Science, Data Science, Engineering, etc.)</li>
<li>You possess a deep, practical expertise in AI/ML concepts, enterprise data architecture, and modern software development with hands-on development of AI solutions—enough to credibly engage with technical teams and executives alike.</li>
<li>You have a proven track record of engaging and influencing C-suite executives, with the ability to navigate complex conversations and align AI strategies with business objectives.</li>
<li>You excel at structuring ambiguous problems into actionable programs and thrive in fast-paced, high-stakes environments.</li>
<li>You are equally comfortable in presales and postsales: shaping deals, designing solutions, and ensuring their successful deployment.</li>
<li>You have experience managing high-value accounts, understanding deal mechanics, and building compelling cases for investment.</li>
<li>You are a collaborative leader who can motivate cross-functional teams and drive results through influence, not authority.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>AI/ML concepts, enterprise data architecture, modern software development, technical management consulting, customer-facing strategic role, digital transformation</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mistral AI</Employername>
      <Employerlogo></Employerlogo>
      <Employerdescription>Mistral AI is an AI technology company that offers high-performance, optimized, open-source and cutting-edge models, products and solutions.</Employerdescription>
      <Employerwebsite>https://mistral.ai/careers</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/mistral/0004f890-99d5-47c5-bb67-8f3f76a1e08f</Applyto>
      <Location>Paris</Location>
      <Country></Country>
      <Postedate>2026-03-10</Postedate>
    </job>
    <job>
      <externalid>1ed2b1e7-a5a</externalid>
      <Title>MS Dynamics Solution Lead</Title>
      <Description><![CDATA[<p>We are seeking an accomplished senior leader to spearhead our Microsoft Dynamics CRM and Power Platform practice, with strong expertise in Azure, integrations, and data solutions. This individual will drive end-to-end presales, solutioning, and business development for the Financial Services domain, ensuring proactive engagement with clients and successful delivery of complex enterprise solutions.</p>
<p><strong>Key Responsibilities:</strong></p>
<p><strong>Presales &amp; Solutioning</strong>: Own end-to-end presales activities including RFP responses, proposals, and proactive pursuits. Design and present innovative solutions leveraging Dynamics CRM, Power Platform, Azure services, and integration frameworks. Collaborate with sales teams to identify opportunities and craft winning strategies.</p>
<p><strong>Domain Expertise</strong>: Apply deep knowledge of Financial Services processes and regulatory requirements to solution design. Act as a trusted advisor for clients on digital transformation initiatives.</p>
<p><strong>Business Development</strong>: Drive proactive pipeline generation and client engagement. Build strong relationships with key stakeholders and influence decision-making at senior levels.</p>
<p><strong>Team Enablement</strong>: Mentor and guide solution architects and consultants. Foster a culture of innovation and continuous learning within the practice.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
<Skills>Microsoft certifications in Dynamics 365, Power Platform, Azure &amp; Gen AI (Copilot) expertise, Proven experience in Microsoft Dynamics CRM, Power Platform, and Azure ecosystem, Strong understanding of integration patterns, data architecture, and cloud-native solutions, Experience in global delivery models and large-scale transformation programmes, Extensive experience in Financial Services (banking, insurance, capital markets), Demonstrated success in leading RFP responses, solution design, and client presentations, Ability to translate business requirements into scalable technical solutions, Track record of building practices, driving revenue growth, and managing senior client relationships, Excellent communication, negotiation, and stakeholder management skills</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global business and technology transformation partner, helping organisations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 350,000 team members in more than 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/5wqq4o3bjNQjiqmDjwSRXH/hybrid-ms-dynamics-solution-lead-in-hyderabad-at-capgemini</Applyto>
      <Location>Hyderabad</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>3437e4dc-7d6</externalid>
      <Title>Backend Engineer (Kotlin) - Shenzhen, China (Senior)</Title>
      <Description><![CDATA[<p><strong>Job Overview</strong></p>
<p>We are seeking a senior backend engineer to join our team in Shenzhen, China. As a senior backend engineer, you will be responsible for designing, developing, and optimizing our core backend systems, including payment gateways, settlement systems, mobile POS integrations, and financial service APIs.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design, develop, and optimize high-performance backend services and APIs (REST/gRPC) using Kotlin or Java</li>
<li>Participate in backend architecture design to ensure system high availability and scalability</li>
<li>Troubleshoot and optimize system performance issues to ensure system stability</li>
<li>Collaborate closely with frontend, product, and testing teams to drive project delivery</li>
<li>Write high-quality, maintainable code and participate in code reviews</li>
<li>Stay up-to-date with the latest trends in backend and cloud technologies</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5+ years of experience in backend development (Kotlin or Java)</li>
<li>Familiarity with microservices architecture, cloud deployment (AWS/GCP), and CI/CD pipelines (GitHub Actions)</li>
<li>Proficiency in SQL/NoSQL databases (PostgreSQL, MySQL, Redis)</li>
<li>Familiarity with Docker, Kubernetes, and message queue systems (Kafka)</li>
<li>Familiarity with API design (REST/gRPC) and version control (GitHub)</li>
<li>Familiarity with agile development processes and good team collaboration skills</li>
<li>Good English communication skills (Mandarin or Cantonese is a plus)</li>
<li>Experience in financial technology, payment, or settlement systems is a plus</li>
</ul>
<p><strong>Preferred Skills</strong></p>
<ul>
<li>High concurrency and large-scale system development experience</li>
<li>Understanding of DevOps, CI/CD pipelines, and ability to drive automation deployment and operations</li>
<li>Experience in distributed systems or data architecture design</li>
<li>Contributions to open-source communities or personal technical blogs</li>
</ul>
<p><strong>Why Join Kody?</strong></p>
<ul>
<li>Global technology company with offices in Singapore, London, and Hong Kong</li>
<li>Flexible work arrangements in Shenzhen and Hong Kong</li>
<li>Technology-driven culture with engineers having core influence on product decisions</li>
<li>Challenging projects, including large-scale backend architecture design and optimization</li>
<li>Competitive salary and benefits to reward your technical contributions</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Kotlin, Java, microservices architecture, cloud deployment, CI/CD pipelines, SQL/NoSQL databases, Docker, Kubernetes, message queue systems, API design, version control, agile development processes, high concurrency and large-scale system development experience, DevOps, distributed systems or data architecture design, open-source communities, personal technical blogs</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Kody</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Kody is a fast-growing Fintech company that provides innovative payment and settlement solutions to hotels, restaurants, and retail brands.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/nt225i4DCrp5uzosmH4FXU/%E5%90%8E%E7%AB%AF%E5%B7%A5%E7%A8%8B%E5%B8%88-(kotlin)---%E6%B7%B1%E5%9C%B3%2C-%E4%B8%AD%E5%9B%BD-(senior)-in-shenzhen-at-kody</Applyto>
      <Location>Shenzhen, China</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7af16166-8fd</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain-level architecture improvement roadmaps that impact the organization.</li>
<li>Works independently with minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain-level knowledge.</li>
<li>Reviews high-level design to ensure alignment to Solution Architecture.</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>
<li>Mentors developers and creates reference implementations/frameworks.</li>
<li>Partners with System Architects to elaborate capabilities and features.</li>
<li>Delivers single-domain architecture solutions and executes a continuous domain-level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Over 6 years of experience as a senior domain architect for Data domains</li>
<li>Advanced English Level</li>
<li>Master&#39;s degree (PLUS)</li>
<li>Insurance experience (PLUS); Financial Services experience (PLUS)</li>
</ul>
<p><strong>Technical &amp; Business Skills:</strong></p>
<ul>
<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>
<li>Data Architecture / Data Modeling – Advanced (MUST)</li>
<li>Data Warehouse – Advanced (MUST)</li>
<li>Cloud Data Platforms - Advanced</li>
<li>Data Integration Tools – Advanced</li>
<li>Snowflake or Databricks - Intermediate (4-6 Years) MUST</li>
<li>Any Cloud - Intermediate (4-6 Years)</li>
<li>Power BI or Tableau - Intermediate (4-6 Years)</li>
<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>
<li>Data Lakehouse – Intermediate (MUST)</li>
<li>Data Governance - Intermediate</li>
<li>AI/ML - Entry Level (PLUS)</li>
<li>Master Data Management - Intermediate</li>
<li>Operational Data Management - Intermediate</li>
</ul>
<p><strong>Benefits:</strong></p>
<p>This position comes with a competitive compensation and benefits package.</p>
<ul>
<li>A competitive salary and performance-based bonuses.</li>
<li>Comprehensive benefits package.</li>
<li>Flexible work arrangements (remote and/or office-based).</li>
<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>
<li>Private Health Insurance.</li>
<li>Paid Time Off.</li>
<li>Training &amp; Development opportunities in partnership with renowned companies.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, AI/ML, Master Data Management, Operational Data Management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global consulting and technology services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/jdUFHSPZZjHsgd3TR4R3BS/remote-fbs-senior-data-domain-architect-in-colombia-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>555d0422-965</externalid>
      <Title>Backend Engineer (Kotlin) - Shenzhen, China</Title>
      <Description><![CDATA[<p><strong>Job Overview</strong></p>
<p>We are seeking a senior Kotlin backend engineer to join our team in Shenzhen, China. As a backend engineer, you will be responsible for designing, developing, and optimizing our backend services to support high-traffic and high-performance applications.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li><strong>Backend Development</strong>: Design, develop, and optimize backend APIs and services using Kotlin (or Java).</li>
<li><strong>Architecture Design</strong>: Participate in system architecture design to ensure code quality, scalability, and high performance.</li>
<li><strong>Database Management</strong>: Optimize SQL/NoSQL databases to improve data access efficiency.</li>
<li><strong>API Design and Development</strong>: Build RESTful or gRPC APIs to integrate with frontend and mobile teams.</li>
<li><strong>Microservices and Cloud Deployment</strong>: Develop scalable microservices architecture and deploy on Docker, Kubernetes, AWS/GCP.</li>
<li><strong>Performance Optimization</strong>: Identify and resolve system performance bottlenecks to improve application efficiency.</li>
<li><strong>Testing and Code Quality</strong>: Write unit tests, integration tests to ensure code stability.</li>
<li><strong>Cross-Team Collaboration</strong>: Work closely with frontend engineers, product managers, QA teams to drive product delivery.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li><strong>3+ years of experience in Kotlin or Java backend development</strong>, familiar with Spring Boot, Ktor, or similar frameworks.</li>
<li>Familiar with PostgreSQL, MySQL, MongoDB, or Redis for database management and optimization.</li>
<li>Familiar with RESTful API, GraphQL, or gRPC design and development.</li>
<li>Familiar with microservices architecture, Docker, Kubernetes, and cloud deployment (AWS, GCP, or Azure).</li>
<li>Familiar with multi-threading, asynchronous programming, and message queues (Kafka, RabbitMQ) for performance optimization.</li>
<li><strong>Good English communication skills</strong> to collaborate with international teams.</li>
<li>Currently residing in Shenzhen or surrounding areas, with regular office work and occasional trips to Hong Kong.</li>
</ul>
<p><strong>Preferred Skills</strong></p>
<ul>
<li>Experience with high-concurrency and large-scale system development.</li>
<li>Understanding of DevOps, CI/CD pipelines to drive automation deployment and operations.</li>
<li>Experience with distributed systems or data architecture design.</li>
<li>Contributions to open-source projects or personal technical blogs.</li>
</ul>
<p><strong>Why Join Kody?</strong></p>
<ul>
<li><strong>Global fintech company</strong> with offices in Singapore, London, and Hong Kong.</li>
<li><strong>Remote work + Hong Kong office</strong> with flexible work arrangements.</li>
<li><strong>Technology-driven culture</strong> with engineers having core influence in product decisions.</li>
<li><strong>Challenging projects</strong> with large-scale backend architecture design and optimization.</li>
<li><strong>Competitive salary and benefits</strong> to reward your technical contributions.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Kotlin, Java, Spring Boot, Ktor, PostgreSQL, MySQL, MongoDB, Redis, RESTful API, GraphQL, gRPC, Docker, Kubernetes, AWS, GCP, Azure, multi-threading, asynchronous programming, message queues, high-concurrency, large-scale system development, DevOps, CI/CD pipelines, distributed systems, data architecture design, open-source projects, personal technical blogs</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Kody</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Kody is a fintech company that provides online payment solutions to brick and mortar businesses.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/j11pr7wKpF3kZfgRFDHhpD/%E5%90%8E%E7%AB%AF%E5%B7%A5%E7%A8%8B%E5%B8%88-(kotlin)---%E6%B7%B1%E5%9C%B3%2C-%E4%B8%AD%E5%9B%BD-in-shenzhen-at-kody</Applyto>
      <Location>Shenzhen, China</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>b06e2fe0-1bd</externalid>
      <Title>Backend Engineer (Kotlin)</Title>
      <Description><![CDATA[<p><strong>Job Description</strong></p>
<p>We are seeking a senior Kotlin backend engineer to join our team. As a backend engineer, you will be responsible for designing, developing, and optimizing backend services to support high-traffic and high-performance applications.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li><strong>Backend Development</strong>: Design, develop, and optimize backend APIs and services using Kotlin (or Java).</li>
<li><strong>Architecture Design</strong>: Participate in system architecture design to ensure code quality, scalability, and high performance.</li>
<li><strong>Database Management</strong>: Optimize SQL/NoSQL databases to improve data access efficiency.</li>
<li><strong>API Design and Development</strong>: Develop RESTful or gRPC APIs to integrate with frontend and mobile teams.</li>
<li><strong>Microservices and Cloud Deployment</strong>: Develop scalable microservices architecture and deploy using Docker, Kubernetes, AWS/GCP.</li>
<li><strong>Performance Optimization</strong>: Identify and resolve system performance bottlenecks to improve application efficiency.</li>
<li><strong>Testing and Code Quality</strong>: Write unit tests, integration tests to ensure code stability.</li>
<li><strong>Cross-Team Collaboration</strong>: Work closely with frontend engineers, product managers, QA teams to drive product delivery.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li><strong>3+ years of experience in Kotlin or Java backend development</strong>, familiar with Spring Boot, Ktor, or similar frameworks.</li>
<li>Familiar with PostgreSQL, MySQL, MongoDB, or Redis for database management and optimization.</li>
<li>Familiar with RESTful API, GraphQL, or gRPC design and development.</li>
<li>Familiar with microservices architecture, Docker, Kubernetes, and cloud deployment (AWS, GCP, or Azure).</li>
<li>Familiar with multi-threading, asynchronous programming, and message queues (Kafka, RabbitMQ) for performance optimization.</li>
<li><strong>Good English communication skills</strong> to collaborate with international teams.</li>
<li>Currently residing in Shenzhen or surrounding areas, with regular office work and occasional trips to Hong Kong.</li>
</ul>
<p><strong>Preferred Skills</strong></p>
<ul>
<li>Experience with high-concurrency and large-scale system development.</li>
<li>Understanding of DevOps, CI/CD pipelines to drive automation deployment and operations.</li>
<li>Experience with distributed systems or data architecture design.</li>
<li>Contributions to open-source projects or personal technical blogs.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Kotlin, Java, Spring Boot, Ktor, PostgreSQL, MySQL, MongoDB, Redis, RESTful API, GraphQL, gRPC, Docker, Kubernetes, AWS, GCP, Azure, Multi-threading, Asynchronous programming, Message queues, High-concurrency system development, DevOps, CI/CD pipelines, Distributed systems, Data architecture design</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Kody</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Kody is a fintech company that provides online payment solutions to brick and mortar businesses.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/gt1SZS2RXmB6Dmj8oD99Uj/%E5%90%8E%E7%AB%AF%E5%B7%A5%E7%A8%8B%E5%B8%88-(kotlin)---%E6%B7%B1%E5%9C%B3%2C-%E4%B8%AD%E5%9B%BD-in-shenzhen-at-kody</Applyto>
      <Location>Shenzhen, China</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>2a56a653-c18</externalid>
      <Title>Palantir Engineer Specialist - Sr. Consultant - Principal</Title>
      <Description><![CDATA[<p><strong>Palantir Engineer Specialist</strong></p>
<p><strong>Sr. Consultant - Principal</strong></p>
<p><strong>London</strong></p>
<p>Do you want to boost your career and collaborate with expert, talented colleagues to solve and deliver against our clients&#39; most important challenges? We are growing and are looking for people to join our team. You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organisation allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset. Are you ready?</p>
<p><strong>About Your Role</strong></p>
<p>As a <strong>Senior Consultant / Principal Consultant – Palantir Engineer</strong>, you lead and deliver end-to-end, data-driven solutions using <strong>Palantir Foundry</strong> in complex client environments. You operate at the intersection of engineering, data, and consulting, working closely with business and technical stakeholders to translate complex problems into scalable, production-ready solutions. You combine strong hands-on technical skills with a consulting mindset, taking ownership of solution design, implementation, and adoption across organisations.</p>
<p><strong>Your role will include:</strong></p>
<ul>
<li>Own the <strong>end-to-end delivery</strong> of Palantir Foundry–based solutions, from problem definition to production</li>
<li>Design and implement <strong>data pipelines and transformations</strong> across diverse data sources</li>
<li>Model data using <strong>Foundry Ontology</strong> concepts to support analytics and operational use cases</li>
<li>Build scalable, reliable solutions using <strong>Python, SQL, and PySpark</strong> within Foundry</li>
<li>Collaborate closely with business stakeholders to define requirements, success metrics, and roadmaps</li>
<li>Support <strong>prototyping, productionisation, and scaling</strong> of data-driven applications</li>
<li>Ensure solutions meet requirements for <strong>data quality, governance, security, and performance</strong></li>
<li>Act as a technical advisor within project teams and contribute to best practices</li>
</ul>
<p><strong>Requirements</strong></p>
<p><strong>What you bring – required</strong></p>
<p><strong>Experience &amp; Seniority</strong></p>
<ul>
<li>Proven experience as a <strong>Senior Consultant or Principal Consultant</strong> in data, analytics, or platform engineering</li>
<li>Strong experience delivering <strong>client-facing data solutions</strong> in complex environments</li>
<li>Ability to take ownership and work independently in ambiguous problem spaces</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong programming skills in <strong>Python</strong> and <strong>SQL</strong>; <strong>PySpark</strong> experience required</li>
<li>Hands-on experience with <strong>Palantir Foundry</strong>, including:
<ul>
<li>Pipeline Builder / Code Workbook</li>
<li>Data integration and transformation</li>
<li>Ontology modelling and data lineage</li>
</ul>
</li>
<li>Solid understanding of <strong>data architectures</strong>, including data lakes, lakehouses, and data warehouses</li>
<li>Experience working with APIs, databases, and structured / semi-structured data</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience building <strong>scalable ETL/ELT pipelines</strong></li>
<li>Familiarity with <strong>CI/CD concepts</strong>, testing, and production deployments</li>
<li>Strong focus on <strong>solution quality, maintainability, and performance</strong></li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field <strong>or equivalent practical experience</strong></li>
</ul>
<p><strong>Nice to have</strong></p>
<ul>
<li>Experience with <strong>cloud platforms</strong> (AWS, Azure, GCP)</li>
<li>Familiarity with <strong>containerisation</strong> (Docker, Kubernetes)</li>
<li>Prior experience as a <strong>Palantir FDE</strong> or in Foundry-heavy delivery roles</li>
<li>Domain experience in industries such as <strong>Energy, Finance, Public Sector, Healthcare, or Logistics</strong></li>
</ul>
<p><strong>Benefits</strong></p>
<p><strong>About your team</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will be working with the most innovative technological solutions in the modern data ecosystem. In this role you’ll see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognised as one of the UK’s top firms by the Financial Times and Forbes for our client innovations, our cultural diversity, and our dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work For. Furthermore, Infosys has been recognised by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, SQL, PySpark, Palantir Foundry, Pipeline Builder, Code Workbook, Data integration, Data transformation, Ontology modelling, Data lineage, Data architectures, Data lakes, Lakehouses, Data warehouses, APIs, Databases, Structured data, Semi-structured data, ETL/ELT pipelines, CI/CD concepts, Testing, Production deployments, Solution quality, Maintainability, Performance, Bachelor’s degree, Master’s degree, Computer Science, Engineering, Mathematics, Cloud platforms, Containerisation, Palantir FDE, Foundry-heavy delivery roles, Domain experience in industries such as Energy, Finance, Public Sector, Healthcare, or Logistics</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. The company is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/2A8U1ryerVijb4fFAc6i8u/hybrid-palantir-engineer-specialist---sr.-consultant---principal-in-london-at-infosys-consulting---europe</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7b03b30a-b20</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>
<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain-level architecture improvement roadmaps that impact the organization.</li>
<li>Works independently with minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain-level knowledge.</li>
<li>Reviews high-level design to ensure alignment to Solution Architecture.</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>
<li>Mentors developers and creates reference implementations/frameworks.</li>
<li>Partners with System Architects to elaborate capabilities and features.</li>
<li>Delivers single-domain architecture solutions and executes a continuous domain-level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, Master Data Management, Operational Data Management, AI/ML</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/1U952YA2QBa8zK7Tm5d3Lm/remote-fbs-senior-data-domain-architect-in-mexico-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>055a769a-c68</externalid>
      <Title>Backend Engineer (Kotlin) - Shenzhen, China (Senior)</Title>
      <Description><![CDATA[<p><strong>Job Overview</strong></p>
<p>As a Senior Backend Engineer at Kody, you will be responsible for designing, developing, and optimizing our core backend systems, including payment gateways, settlement systems, mobile POS integrations, and financial service APIs.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design, develop, and optimize high-performance backend services and APIs (REST/gRPC) using Kotlin or Java</li>
<li>Participate in backend architecture design to ensure system high availability and scalability</li>
<li>Troubleshoot and optimize system performance issues to ensure system stability</li>
<li>Collaborate closely with frontend, product, and testing teams to drive project delivery</li>
<li>Write high-quality, maintainable code and participate in code reviews</li>
<li>Stay up-to-date with the latest developments in backend and cloud technologies</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5+ years of experience in backend development (Kotlin or Java)</li>
<li>Familiarity with microservices architecture, cloud deployment (AWS/GCP), and CI/CD pipelines (GitHub Actions)</li>
<li>Familiarity with SQL/NoSQL databases (PostgreSQL, MySQL, Redis)</li>
<li>Familiarity with Docker, Kubernetes, and message queue systems (Kafka)</li>
<li>Familiarity with API design (REST/gRPC) and version control (GitHub)</li>
<li>Familiarity with agile development processes and good team collaboration skills</li>
<li>Good English communication skills (Mandarin or Cantonese is a plus)</li>
<li>Experience in financial technology, payment, or settlement systems is a plus</li>
</ul>
<p><strong>Preferred Qualifications</strong></p>
<ul>
<li>Experience in high-concurrency, large-scale system development</li>
<li>Understanding of DevOps, CI/CD pipelines, and ability to drive automation deployment and operations</li>
<li>Experience in distributed systems or data architecture design</li>
<li>Contributions to open-source communities or personal technical blogs</li>
</ul>
<p><strong>Why Join Kody?</strong></p>
<ul>
<li>Global technology company with offices in Singapore, London, and Hong Kong</li>
<li>Flexible work arrangements in Shenzhen and Hong Kong</li>
<li>Technology-driven culture with engineers having core influence on product decisions</li>
<li>Challenging projects, including large-scale backend architecture design and optimization</li>
<li>Competitive salary and benefits to reward your technical contributions</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Kotlin, Java, microservices architecture, cloud deployment, CI/CD pipelines, SQL/NoSQL databases, Docker, Kubernetes, message queue systems, API design, version control, agile development processes, high-concurrency, large-scale system development, DevOps, distributed systems, data architecture design, open-source communities, personal technical blogs</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Kody</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Kody is a fast-growing Fintech company that provides innovative payment and settlement solutions to hotels, restaurants, and retail brands.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/2mZgGN3RMgykf9oavxvD49/%E5%90%8E%E7%AB%AF%E5%B7%A5%E7%A8%8B%E5%B8%88-(kotlin)---%E6%B7%B1%E5%9C%B3%2C-%E4%B8%AD%E5%9B%BD-(senior)-in-shenzhen-at-kody</Applyto>
      <Location>Shenzhen, China</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>f867ca73-2e0</externalid>
      <Title>Lead Data Consultant (H/F) Paris</Title>
      <Description><![CDATA[<p><strong>A leading data company in Paris</strong></p>
<p>Fifty-Five is a global data company that helps brands collect, analyse and activate their data across paid, earned and owned channels to increase their marketing ROI and improve customer acquisition and retention.</p>
<p>We are looking for a Lead Data Consultant to join our team in Paris. As a Lead Data Consultant, you will be responsible for leading data projects and working closely with our clients to understand their data needs and develop solutions to meet those needs.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Lead data projects from start to finish, including data collection, analysis and activation</li>
<li>Work closely with clients to understand their data needs and develop solutions to meet those needs</li>
<li>Collaborate with our data team to develop and implement data strategies</li>
<li>Analyse data to identify trends and insights that can inform business decisions</li>
<li>Develop and maintain relationships with clients to ensure their data needs are met</li>
<li>Stay up-to-date with the latest data trends and technologies</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>3-6 years of experience in data analysis and consulting</li>
<li>Strong understanding of data analysis and statistical techniques</li>
<li>Experience working with large datasets and data visualisation tools</li>
<li>Excellent communication and project management skills</li>
<li>Ability to work independently and as part of a team</li>
<li>Strong analytical and problem-solving skills</li>
<li>Experience working with data platforms such as Google Analytics and Google Cloud Platform</li>
<li>Strong understanding of data privacy and security regulations</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunity to work with a leading data company in Paris</li>
<li>Collaborative and dynamic work environment</li>
<li>Professional development opportunities</li>
<li>Flexible working hours and remote work options</li>
<li>Access to the latest data tools and technologies</li>
<li>Opportunity to work on a variety of data projects and clients</li>
<li>Recognition and rewards for outstanding performance</li>
</ul>
<p><strong>How to Apply</strong></p>
<p>If you are a motivated and experienced data professional looking for a new challenge, please submit your application, including your resume and a cover letter, to [insert contact information]. We look forward to hearing from you!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data analysis, data visualisation, data strategy, data privacy, data security, Google Analytics, Google Cloud Platform, data science, machine learning, data engineering, data architecture</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Fifty-Five</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Fifty-Five is a global data company that helps brands collect, analyse and activate their data across paid, earned and owned channels to increase their marketing ROI and improve customer acquisition and retention. The company has over 320 experts and is part of The Brandtech Group.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/wPj5jcg35AZgUXYdKWsC6a/lead-data-consultant-(h%2Ff)-paris-in-paris-at-fifty-five</Applyto>
      <Location>Paris</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>dcfed817-412</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Senior Data Domain Architect to join our team. As a Senior Data Domain Architect, you will design and develop Data/Domain IT architecture solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Apply in-depth conceptual and practical knowledge of Domain Architecture, and basic knowledge of related disciplines, to perform complex technical planning, architecture development, and modification of specifications for Domain solution delivery</li>
<li>Solve complex problems and partner effectively to execute broad, continuous Domain-level architecture improvement roadmaps that impact the organization</li>
<li>Work independently with minimal guidance and direction, solving for and influencing Enterprise and System architecture through Domain-level knowledge</li>
<li>Review high-level designs to ensure alignment with the Solution Architecture</li>
<li>May lead projects or project steps within a broader project, or hold accountability for ongoing activities or objectives</li>
<li>Mentor developers and create reference implementations/frameworks</li>
<li>Partner with System Architects to elaborate capabilities and features</li>
<li>Deliver single-domain architecture solutions and execute the continuous domain-level architecture improvement roadmap; actively support design and steering of a continuous delivery pipeline</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Over 6 years of experience as a senior domain architect for Data domains</li>
<li>Advanced English Level</li>
<li>Master&#39;s degree (PLUS)</li>
<li>Insurance experience (PLUS); Financial Services experience (PLUS)</li>
</ul>
<p><strong>Technical &amp; Business Skills:</strong></p>
<ul>
<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>
<li>Data Architecture / Data Modeling - Advanced (MUST)</li>
<li>Data Warehouse - Advanced (MUST)</li>
<li>Cloud Data Platforms - Advanced</li>
<li>Data Integration Tools - Advanced</li>
<li>Snowflake or Databricks - Intermediate (4-6 Years) (MUST)</li>
<li>Any Cloud - Intermediate (4-6 Years)</li>
<li>Power BI or Tableau - Intermediate (4-6 Years)</li>
<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>
<li>Data Lakehouse - Intermediate (MUST)</li>
</ul>
<p><strong>Benefits:</strong></p>
<ul>
<li>A competitive salary and performance-based bonuses</li>
<li>Comprehensive benefits package</li>
<li>Flexible work arrangements (remote and/or office-based)</li>
<li>Private Health Insurance</li>
<li>Paid Time Off</li>
<li>Training &amp; Development opportunities in partnership with renowned companies</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/x7tKXYFBB815ca6oBV5T2E/remote-fbs-senior-data-domain-architect-in-brazil-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>ee2fcbdc-fc4</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will act as a senior technical leader in complex data and analytics engagements, shaping and governing end-to-end enterprise data architectures, leading technical teams, and serving as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will be responsible for ensuring that enterprise data and analytics solutions are scalable, secure, and production-ready, while translating business requirements into robust technical designs and delivery roadmaps.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:
<ul>
<li>Data Mesh / Data Fabric / data lake / data warehouse architectures</li>
<li>Modern data architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data platform, DevOps, deployment, and security architectures</li>
<li>Analytics and AI enablement architectures</li>
</ul>
</li>
<li>Hands-on experience with cloud data platforms, e.g.:
<ul>
<li>Azure, AWS or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
</ul>
</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:
<ul>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data management software and tools</li>
<li>Security, access control, and compliance considerations</li>
</ul>
</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on Experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will use the most innovative technological solutions in the modern data ecosystem. In this role you’ll see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity, and dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us to its list of Best Firms to Work For. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, Azure, AWS or GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, Postgres, SQL Server, Oracle, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, Docker / Kubernetes, Advanced analytics, AI / ML or GenAI, Streaming platforms (e.g. Kafka, Azure Event Hubs), Data governance or metadata tools, Cloud, data, or architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. The company is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/uuSzzCt8qNbo6UpEFkSyjY/hybrid-principal-consultant---data-architecture-in-london-at-infosys-consulting---europe</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>4fdc359b-d70</externalid>
      <Title>Enterprise Architect - Retail &amp; Omni-Channel - CRL - Germany</Title>
      <Description><![CDATA[<p>Do you want to boost your career and collaborate with expert, talented colleagues to solve and deliver against our clients&#39; most important challenges? We are growing and are looking for people to join our team. You&#39;ll be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset. Are you ready?</p>
<p>The Enterprise Architect - Retail &amp; Omni-Channel will collaborate with the other key areas of the company (business/IT) to ensure that plans are based on the overall Strategy and Architecture direction, defining new requirements and driving innovative business processes and technical solutions.</p>
<p>You will be responsible for setting the future direction of these capabilities, outlining a roadmap that meets the strategic needs of the organisation and working with the relevant teams to assure their delivery.</p>
<p><strong>Essential Duties and Responsibilities:</strong></p>
<ul>
<li>Collaborate with IT leaders, other Enterprise Domain Architects and leaders in the business to document IT capability requirements (strategic) as part of annual/on-going planning and governance.</li>
<li>Deliver Enterprise IT strategic roadmaps for Retail, eCommerce and Direct To Consumer channels (3+ year forward looking view) that are focused on technology capabilities for Business, Information (data) and Technology solutions.</li>
<li>Ensure that roadmaps reflect invest/divest, retirement, and consolidation strategies for systems globally and that industry best practices and benchmarks are incorporated into deliverables.</li>
<li>Ensure that solution design is aligned with the target domain architecture and business roadmaps.</li>
<li>Lead and deliver enterprise evaluations and analyses that adhere to established global IT project and requirements management methodologies, governance/check-point, release management, and tools standards for approved projects requiring IT solutions</li>
<li>Partner with IT functions and their solution architects to design integrated IT solutions (including the build vs. buy decision) that translate business capability requirements and adhere to established global IT technology standards (infrastructure, application, integration, data).</li>
<li>Provide Architecture consulting to solution architects within delivery/execution teams (design, build/test, conversion/cutover) for business and technical projects across the globe.</li>
<li>Provide analysis and recommendations, via benchmarks and external industry point of view, on enterprise department performance for “Run and Maintain”.</li>
<li>Support IT and business leaders with technology advisory services for engaging with the business to “solve” business problems via technology and process innovation relative to IT capability requirements.</li>
<li>Act as informal mentor and coach to Solution Architects within other IT teams.</li>
<li>Contribute to and represent Retail and Omni-Channel at the IT Architectural Board.</li>
<li>Work directly with external vendors and advisory agencies to bring relevant best practices to the Enterprise Architecture team.</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>Bachelor&#39;s degree in Computer Science, Engineering or related discipline/experience preferred</li>
<li>Minimum 5+ years of IT experience in an architecture role responsible for both business systems support and the successful delivery and deployment of IT technology-based solutions</li>
<li>Well-versed in Enterprise Architecture, including network, data, application, system, and integration architecture.</li>
<li>TOGAF Certification or other architecture professional certification a plus</li>
</ul>
<p><strong>Experience</strong></p>
<ul>
<li>Apparel experience strongly desired, particularly with retail or wholesale.</li>
<li>Consumer products and/or apparel retailing experience preferred.</li>
<li>SAP Retail or equivalent Retail Merchandising System experience preferred.</li>
<li>Hybris, ATG, Sterling or other headless eCommerce platforms experience preferred.</li>
<li>Point of Sale and other Retail System implementation experience preferred</li>
<li>Understanding of technology directions, trends and strategic business impacts related to the retail industry.</li>
<li>Proficiency in process management, technology evaluation, project management and business intelligence.</li>
<li>Familiarity with the balanced scorecard concept, IT standards such as ITIL and COBIT, and compliance topics such as Sarbanes-Oxley (SOX) and Payment Card Industry (PCI) standards is preferred.</li>
<li>Experience leading systems implementations to a successful conclusion.</li>
</ul>
<ul>
<li>Clear understanding of the software development lifecycle (SDLC) and of project and business process management methodologies and tools.</li>
<li>Excellent written and verbal communication and presentation skills.</li>
<li>Ability to serve as a change agent and influence business direction.</li>
<li>Ability to work effectively in a global and matrixed environment.</li>
</ul>
<p><strong>About your team</strong></p>
<p>Our CRL (Consumer Goods, retail &amp; Logistics) practice helps some of the largest global firms and most recognizable local brands solve their biggest challenges in today’s age of constant disruption. With diverse services spanning growth strategy and new product innovation, to omni-channel customer experience, supply chain resiliency and AI-driven new business models, we help clients shape and achieve their growth agenda for a sustainable future. We transform traditional organizations to digitally centric business models and drive new revenue streams.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology.  We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, <strong>Equity</strong> and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity, and dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us to its list of Best Firms to Work For. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal goals. Curious to learn more? We’d love to hear from you. <strong>Apply today!</strong></p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise Architecture, Network Architecture, Data Architecture, Application Architecture, System Architecture, Integration Architecture, TOGAF Certification, SAP Retail, Hybris, ATG, Sterling, Point of Sale, Retail System implementation, Technology directions, Trends, Strategic business impacts, Process management, Technology evaluation, Project management, Business intelligence, ITIL, COBIT, Sarbanes-Oxley, Payment Cards Industry, PCI, Software development lifecycle, Project and business process management methodologies, Apparel experience, Consumer products, Apparel retailing, Retail Merchandising System, Headless eCommerce platforms, Balanced scorecard concept, IT standards, Compliance topics</Skills>
      <Category>Consulting</Category>
      <Industry>Consulting</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. The company is a mid-size player with a supportive, entrepreneurial spirit.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/rHp6ir9DAjZnXjrtsJHzjn/hybrid-enterprise-architect---retail-%26-omni-channel---crl---germany-in-munich-at-infosys-consulting---europe</Applyto>
      <Location>Munich, Bavaria, Germany</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>56dc9a51-e66</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will act as a senior technical leader in complex data and analytics engagements. You will shape and govern end-to-end enterprise data architectures, lead technical teams, and serve as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:
<ul>
<li>Data Mesh / Data Fabric / data lake / data warehouse architectures</li>
<li>Modern data architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data platform, DevOps, deployment, and security architectures</li>
<li>Analytics and AI enablement architectures</li>
</ul>
</li>
<li>Hands-on experience with cloud data platforms, e.g.:
<ul>
<li>Azure, AWS, or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
</ul>
</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, and data quality remediation</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:
<ul>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data management software and tools</li>
<li>Security, access control, and compliance considerations</li>
</ul>
</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on Experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>You will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role you’ll see your own ideas transform into breakthrough results across Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market-leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes for our client innovations, our cultural diversity, and the dedicated training and career paths we offer. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work For. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities, so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>enterprise data architecture, system data integration, data engineering, analytics, modern data architectures, Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, cloud data platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, SQL, relational databases, Postgres, SQL Server, Oracle, NoSQL databases, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, data migration programmes, data pipelines, orchestration, automation, CI/CD concepts, production-grade deployments, distributed systems, Docker, Kubernetes, data management and governance principles, data quality, metadata, lineage, master data management, data management software and tools, security, access control, compliance considerations, Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience, advanced analytics, AI / ML or GenAI, streaming platforms, Kafka, Azure Event Hubs, data governance or metadata tools, cloud, data, architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. It is a mid-size player with a supportive, entrepreneurial spirit.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/hpBWjvvy8D6B1f818cHxZR/remote-principal-consultant---data-architecture-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>fbb19758-f83</externalid>
      <Title>Principal Consultant Data Architecture (m/w/d)</Title>
<Description><![CDATA[<p>Are you looking to advance your career and work with experienced, talented colleagues to solve our clients’ most significant challenges? We are growing and seeking engaged individuals to strengthen our team. You will be part of a dynamic, fast-growing company with over 300,000 employees.</p>
<p>Our dynamic organisation allows you to work across themes and bring in your ideas, experiences, creativity, and goal orientation. Are you ready?</p>
<p>As a Principal Consultant Data Architecture, you will be the technical leader in complex data and analytics projects. You will design and be responsible for comprehensive enterprise data architectures, lead technical teams, and be a trusted technical advisor for customers and internal stakeholders.</p>
<p>You will ensure that enterprise data and analytics solutions are scalable, secure, and operational, translate business requirements into robust technical blueprints, and plan their rollout.</p>
<p><strong>Your Tasks:</strong></p>
<ul>
<li>Definition and governance of target architectures for enterprise data, integration, and analytics in cloud and hybrid environments</li>
<li>Translation of business goals into scalable, secure, and compliant architectures</li>
<li>Leading the design of comprehensive end-to-end data solutions (data ingestion, data integration, storage, security, processing, analytics, AI enablement)</li>
<li>Steering and supporting delivery teams through implementation, rollout, and production readiness</li>
<li>Senior technical contact person for architects, IT managers, and technical teams of customers</li>
<li>Mentoring of system and data architects as well as programmers</li>
<li>Participation in the further development of best practices and reference architectures</li>
<li>Support of presales and solution design activities from a technical perspective</li>
</ul>
<p><strong>What You Bring - Minimum Requirements</strong></p>
<p><strong>Experience &amp; Seniority</strong></p>
<ul>
<li>At least 5 years of relevant professional experience in enterprise data architecture, data integration, data engineering, or analytics</li>
<li>Experience in leading enterprise data architecture workstreams or technical teams</li>
<li>Strong customer and advisory experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>In-depth expertise in modern data architectures, particularly:</li>
</ul>
<ol>
<li>Data Mesh / Data Fabric / Data Lake / Data Warehouse Architectures</li>
<li>Principles of modern data architecture designs</li>
<li>Integration patterns for batch and streaming data</li>
<li>Data platform, DevOps, deployment, and security architectures</li>
<li>Analytics and AI enablement architectures</li>
</ol>
<ul>
<li>Practical experience with cloud data platforms, such as:</li>
</ul>
<ol>
<li>Azure, AWS, or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
</ol>
<ul>
<li>Very good SQL knowledge as well as experience with relational databases (e.g. PostgreSQL, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Good understanding of API-based and event-driven architectures</li>
<li>Experience in designing and steering enterprise data migration programs (including mapping, transformation rules, data quality measures, etc.)</li>
</ul>
<p><strong>Engineering &amp; Platform Fundamentals</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Knowledge of CI/CD concepts and production-ready deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes knowledge is an advantage</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Very good understanding of data management and governance principles, particularly:</li>
</ul>
<ol>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data management software and tools</li>
<li>Security, access, and compliance requirements</li>
</ol>
<ul>
<li>Bachelor&#39;s or master&#39;s degree in computer science, engineering, mathematics, or a related field, or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Experience with advanced analytics, AI/ML, or GenAI from an architect&#39;s perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Practical experience with data governance or metadata tools</li>
<li>Cloud or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility (Germany)</strong></p>
<ul>
<li>Fluent German skills (at least C1) for customer communication in the German-speaking market</li>
<li>Very good English skills</li>
<li>Project-related travel readiness</li>
</ul>
<p><strong>About Your Team</strong></p>
<p>You will become part of our growing data and analytics teams. In this area, you will work with modern technologies in modern data ecosystems. You have the opportunity to turn your own ideas into results - in the areas of data and analytics strategy, data management and governance, data platforms and engineering, as well as analytics and data science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>You will become an employee of a globally renowned management consulting firm that is at the forefront of industry disruption. We work across industries with leading companies. Our culture is inclusive and entrepreneurial. As a mid-sized consulting firm embedded in the scale of Infosys, we can partner with our customers worldwide throughout their entire transformation journey.</p>
<p>Our values IC-LIFE - Inclusion, Equity &amp; Diversity, Client, Leadership, Integrity, Fairness, and Excellence - form our compass of values. Further information can be found on our career website.</p>
<p>In Europe, we are recognized by the Financial Times and Forbes as one of the leading consulting firms. Infosys is one of Germany’s top employers for 2023 and has been certified by the Top Employers Institute for outstanding working conditions in Europe for five years in a row.</p>
<p>We offer market-leading remuneration, attractive additional benefits, and excellent further education and development opportunities. Curious? Then we look forward to your application.</p>
<p>Infosys Consulting is a globally renowned management consulting firm that is on the front-line of industry disruption. We are a mid-size player with a supportive, entrepreneurial spirit that works with a market-leading brand in every sector, while our parent organization Infosys is a top-5 powerhouse IT brand that is outperforming the market and experiencing rapid growth.</p>
<p>Our consulting business is annually recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths we offer to our consultants. We are committed to fostering an inclusive work culture that inspires everyone to deliver their best.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Mesh, Data Fabric, Data Lake, Data Warehouse Architectures, Principles of modern data architecture designs, Integration patterns for batch and streaming data, Data platform, DevOps, deployment, and security architectures, Analytics and AI enablement architectures, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, PostgreSQL, SQL-Server, Oracle, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, Enterprise data migration programs</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting is a globally renowned management consulting firm that works with a market-leading brand in every sector, while its parent organization Infosys is a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/sve4gTuNFLf3RtEjhQMzHp/remote-principal-consultant-data-architecture-(m%2Fw%2Fd)--deutschlandweit-in-munich-at-infosys-consulting---europe</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>01be118d-100</externalid>
      <Title>Palantir Engineer Specialist - Sr. Consultant - Principal</Title>
<Description><![CDATA[<p><strong>About Your Role</strong></p>
<p>As a Senior Consultant / Principal Consultant – Palantir Engineer, you will lead and deliver end-to-end, data-driven solutions using Palantir Foundry in complex client environments. You will operate at the intersection of engineering, data, and consulting, working closely with business and technical stakeholders to translate complex problems into scalable, production-ready solutions.</p>
<p><strong>Your role will include:</strong></p>
<ul>
<li>Own the end-to-end delivery of Palantir Foundry–based solutions, from problem definition to production</li>
<li>Design and implement data pipelines and transformations across diverse data sources</li>
<li>Model data using Foundry Ontology concepts to support analytics and operational use cases</li>
<li>Build scalable, reliable solutions using Python, SQL, and PySpark within Foundry</li>
<li>Collaborate closely with business stakeholders to define requirements, success metrics, and roadmaps</li>
<li>Support prototyping, productionisation, and scaling of data-driven applications</li>
<li>Ensure solutions meet requirements for data quality, governance, security, and performance</li>
<li>Act as a technical advisor within project teams and contribute to best practices</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Proven experience as a Senior Consultant or Principal Consultant in data, analytics, or platform engineering</li>
<li>Strong experience delivering client-facing data solutions in complex environments</li>
<li>Ability to take ownership and work independently in ambiguous problem spaces</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong programming skills in Python and SQL; PySpark experience required</li>
<li>Hands-on experience with Palantir Foundry, including:
<ul>
<li>Pipeline Builder / Code Workbook</li>
<li>Data integration and transformation</li>
<li>Ontology modelling and data lineage</li>
</ul>
</li>
<li>Solid understanding of data architectures, including data lakes, lakehouses, and data warehouses</li>
<li>Experience working with APIs, databases, and structured / semi-structured data</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience building scalable ETL/ELT pipelines</li>
<li>Familiarity with CI/CD concepts, testing, and production deployments</li>
<li>Strong focus on solution quality, maintainability, and performance</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to have</strong></p>
<ul>
<li>Experience with cloud platforms (AWS, Azure, GCP)</li>
<li>Familiarity with containerisation (Docker, Kubernetes)</li>
<li>Prior experience as a Palantir FDE or in Foundry-heavy delivery roles</li>
<li>Domain experience in industries such as Energy, Finance, Public Sector, Healthcare, or Logistics</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice, you will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role, you’ll see your own ideas transform into breakthrough results across Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Infosys Consulting is a globally renowned management consulting firm that is on the front-line of industry disruption. We are a mid-size player with a supportive, entrepreneurial spirit that works with a market-leading brand in every sector, while our parent organization Infosys is a top-5 powerhouse IT brand that is outperforming the market and experiencing rapid growth.</p>
<p>Our consulting business is annually recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths we offer to our consultants. We are committed to fostering an inclusive work culture that inspires everyone to deliver their best.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, SQL, PySpark, Palantir Foundry, Pipeline Builder / Code Workbook, Data integration and transformation, Ontology modelling and data lineage, Data architectures, APIs, Databases, Structured / semi-structured data, Cloud platforms, Containerisation, Palantir FDE, Foundry-heavy delivery roles, Domain experience in industries such as Energy, Finance, Public Sector, Healthcare, or Logistics</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting is a globally renowned management consulting firm that works with market-leading brands across sectors. It is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/2u6mMfyRc8Yxg8qmvZBSMX/remote-palantir-engineer-specialist---sr.-consultant---principal-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>89cf8655-bba</externalid>
      <Title>Sr Business Solutions Analyst - Data Product Owner</Title>
      <Description><![CDATA[<p>Develop complex value-based analytics solutions for quality, business process functionality, or other strategic/data/analytic initiatives that are complex in nature.</p>
<p>Collaborate with cross-functional teams to identify and establish analytics requirements.</p>
<p>Create detailed documentation and business plans that address stakeholder needs.</p>
<p>Liaise between IT and Product Management for data solutions.</p>
<p>Act as a trusted business partner, working with business leaders and cross-functional teams to identify key challenges and drive resolutions that improve organisational performance.</p>
<p>Use advanced understanding of data elements, structures, standards typically involved in financial services transactions, reporting, and cloud-based data storage/sharing to adopt and apply standards in a way that satisfies and balances regulatory and compliance requirements with business needs.</p>
<p>Develop and implement complex enterprise-wide data integration strategies that utilise various data-marts and data warehouse applications to enhance reporting.</p>
<p>Continuously work to improve data platforms, dashboards, and reports to meet emerging and changing business demands and guidelines.</p>
<p>Consult, with minimal guidance from a Principal, on UAT, technical specifications, strategic planning, contract design, technical support, testing updates, and training.</p>
<p>Manage data from multiple sources and use project management skills to complete complex projects with minimal coaching and guidance.</p>
<p>Transmit data and proactively work to ensure data quality.</p>
<p>Troubleshoot data issues and devise creative and effective ways to avoid or mitigate issues.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Warehousing, Data Architecture &amp; Design, Auto &amp; Home Insurance, SQL, Snowflake, Informatica, Power BI</Skills>
      <Category>IT</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The company has a strong 55-year heritage and deep industry expertise.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/n5t7HfGHJjo9guNgu39F28/hybrid-sr-business-solutions-analyst---data-product-owner-in-pune-at-capgemini</Applyto>
      <Location>Pune, Maharashtra, India</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>93e8d013-2c1</externalid>
      <Title>Sr Business Solutions Analyst - Data Product Owner</Title>
<Description><![CDATA[<p>Develop complex value-based analytics solutions for quality, business process functionality, or other strategic/data/analytic initiatives. Collaborate with cross-functional teams to identify and establish analytics requirements. Create detailed documentation and business plans that address stakeholder needs. Liaise between IT and Product Management for data solutions.</p>
<p>Work with business leaders and cross-functional teams to identify key challenges and work on resolutions that improve organisational performance. Use advanced understanding of the data elements, structures, and standards typically involved in financial services transactions, reporting, and cloud-based data storage/sharing to adopt and apply standards in a way that balances regulatory and compliance requirements with business needs.</p>
<p>Develop and implement complex enterprise-wide data integration strategies that utilise various data marts and data warehouse applications to enhance reporting. Continuously work to improve data platforms, dashboards, and reports to meet emerging and changing business demands and guidelines.</p>
<p>Consult with minimal guidance on UAT, technical specifications, strategic planning, contract design, technical support, testing updates, and training. Manage data from multiple sources and use project management skills to complete complex projects with minimal coaching and guidance. Transmit data and proactively work to ensure data quality. Troubleshoot data issues and devise creative and effective ways to avoid or mitigate issues.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Developing complex value-based analytics solutions</li>
<li>Collaborating with cross-functional teams to identify and establish analytics requirements</li>
<li>Creating detailed documentation and business plans</li>
<li>Liaising between IT and Product Management for data solutions</li>
<li>Working with business leaders and cross-functional teams to identify key challenges and work on resolutions to improve organisational performance</li>
<li>Developing and implementing complex enterprise-wide data integration strategies</li>
<li>Continuously working to improve data platforms, dashboards, and reports</li>
<li>Consulting with minimal guidance on various technical and business aspects</li>
<li>Managing data from multiple sources and using project management skills to complete complex projects</li>
<li>Troubleshooting data issues and devising creative and effective ways to avoid or mitigate issues</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Warehousing, Data Architecture &amp; Design, Auto &amp; Home Insurance, SQL, Snowflake, Informatica, Power BI</Skills>
      <Category>IT</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The company has a strong 55-year heritage and deep industry expertise.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/iUXtT6bRSaetL2aU9Hq8RU/hybrid-sr-business-solutions-analyst---data-product-owner-in-hyderabad-at-capgemini</Applyto>
      <Location>Hyderabad</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>ecdc5591-27d</externalid>
      <Title>Data Engineer</Title>
      <Description><![CDATA[<p>We are seeking a highly skilled Data Engineer to join our team. As a Data Engineer, you will play a key role in the development and maintenance of our data infrastructure, ensuring that our data is accurate, reliable, and secure.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design, develop, and maintain data pipelines and architectures to support our data-driven decision-making processes</li>
<li>Collaborate with our data scientists and analysts to understand their data requirements and develop solutions to meet those needs</li>
<li>Work closely with our IT team to ensure that our data systems are integrated with our existing infrastructure</li>
<li>Develop and maintain data quality and governance processes to ensure that our data is accurate and reliable</li>
<li>Participate in the development and maintenance of our data architecture roadmap</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Bachelor&#39;s degree in Computer Science, Mathematics, or a related field</li>
<li>2+ years of experience in data engineering or a related field</li>
<li>Strong understanding of data engineering principles and practices</li>
<li>Experience with data warehousing and business intelligence tools</li>
<li>Strong programming skills in languages such as Python, Java, or C++</li>
<li>Experience with cloud-based data platforms such as AWS or GCP</li>
<li>Strong communication and collaboration skills</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunity to work with a leading Formula One racing team</li>
<li>Collaborative and dynamic work environment</li>
<li>Professional development and growth opportunities</li>
<li>Access to state-of-the-art technology and tools</li>
<li>Flexible working hours and remote work options</li>
</ul>
<p>Note: The salary range for this position is competitive and will be discussed during the interview process.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>Competitive and will be discussed during the interview process</Salaryrange>
      <Skills>data engineering, data warehousing, business intelligence, Python, Java, C++, AWS, GCP, cloud computing, data architecture, data governance</Skills>
      <Category>Engineering</Category>
      <Industry>Motorsport</Industry>
      <Employername>Williams Racing</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.williamsf1.com.png</Employerlogo>
      <Employerdescription>Williams Racing is a British Formula One racing team that has been in operation since 1977. The team is based in Grove, Oxfordshire.</Employerdescription>
      <Employerwebsite>https://careers.williamsf1.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.williamsf1.com/job/trackside-operations-lead-hospitality-in-london-jid-487</Applyto>
      <Location>Grove</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>989a1f92-1fb</externalid>
      <Title>Development Manager</Title>
      <Description><![CDATA[<p>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. The Data &amp; Insights team harnesses the power of data to deliver transformative insights and solutions to EA game teams and players.</p>
<p>As a Development Manager supporting the Enterprise Data Engineering team, you will play an important role in helping turn complex data initiatives into reliable, high-impact outcomes. Working closely with engineering managers, data engineers, and project and program management peers, you will support the planning, coordination, and delivery of modern data architectures, platforms, pipelines, and products that power analytics and inform both day-to-day and strategic decision-making across our studios and corporate stakeholders.</p>
<p>Project Management:</p>
<ul>
<li>Oversee project schedules and execution, ensuring timely delivery of milestones.</li>
<li>Balance scope, timelines, and overall deliverable or project quality.</li>
<li>Track dependencies, identify and mitigate risks, manage blockers, prioritize tasks, and ensure delivery commitments and project goals are met.</li>
<li>Ensure timely and effective reporting to leadership and stakeholders, escalating and seeking timely support when necessary.</li>
</ul>
<p>Team Coordination and Collaboration:</p>
<ul>
<li>Collaborate with cross-functional teams and stakeholders, including engineers, product managers, and other project/program managers.</li>
<li>Conduct regular meetings and Scrum ceremonies, such as sprint planning sessions, recurring stand-ups, and retrospectives, to continually align priorities with team delivery efforts.</li>
<li>Communicate status and progress updates effectively following established processes and channels.</li>
<li>Support capacity planning, identifying and communicating resource constraints or challenges, and managing change requests to minimize delivery disruptions.</li>
<li>Identify challenges and propose opportunities to level up processes, engagement, reporting, and communication.</li>
</ul>
<p>WHAT YOU WILL BRING</p>
<ul>
<li>2+ years of relevant project management experience in a technical or data-driven environment.</li>
<li>Bachelor’s degree in a relevant field (e.g., business, management, information systems, engineering).</li>
<li>Demonstrated experience with project management methodologies such as Scrum, Kanban, and other Agile approaches, and with related tools (e.g., Jira); Scrum or Agile certification is a plus.</li>
<li>Proven ability to work with cross-functional teams and stakeholders to align on deliverables and timelines, break work into executable tasks, size effort, and support conflict resolution when appropriate.</li>
<li>Strong communicator: comfortable facilitating meetings, leading discussions, coordinating and bridging gaps between teams, talking through solutions and distilling information, and reporting to leadership.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>project management, Scrum, Agile, Kanban, JaaS/Jira, data engineering, data architecture, data analytics, communication, team collaboration, problem-solving, time management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts is a leading video game developer and publisher with a portfolio of games and experiences across various platforms.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Development-Manager/212690</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>6ea8846c-bf3</externalid>
      <Title>Data Engineer</Title>
      <Description><![CDATA[<p><strong>Data Engineer</strong></p>
<p>We are seeking a highly skilled Data Engineer to join our Data and Analytics team. As a Data Engineer, you will play a key role in the development and maintenance of our data infrastructure, ensuring that our data is accurate, reliable, and easily accessible to our teams.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design, develop, and maintain data pipelines and architectures to support our data-driven decision-making processes</li>
<li>Collaborate with cross-functional teams to identify data requirements and develop solutions to meet those needs</li>
<li>Work closely with our data scientists to ensure that our data is accurate, complete, and easily accessible</li>
<li>Develop and maintain data visualizations and reports to support our business needs</li>
<li>Troubleshoot data-related issues and implement solutions to prevent future occurrences</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Bachelor&#39;s degree in Computer Science, Mathematics, or a related field</li>
<li>2+ years of experience in data engineering or a related field</li>
<li>Strong understanding of data structures, algorithms, and software design patterns</li>
<li>Experience with data warehousing and business intelligence tools</li>
<li>Strong programming skills in languages such as Python, Java, or C++</li>
<li>Experience with cloud-based data platforms such as AWS or GCP</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunity to work with a leading Formula One team</li>
<li>Collaborative and dynamic work environment</li>
<li>Professional development opportunities</li>
</ul>
<p><strong>How to Apply</strong></p>
<p>If you are a motivated and experienced Data Engineer looking for a new challenge, please submit your application, including your CV and a cover letter, to [insert contact email or link to application portal].</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, data pipelines, data architecture, data visualization, data warehousing, business intelligence, Python, Java, C++, AWS, GCP, cloud computing, big data, machine learning</Skills>
      <Category>Engineering</Category>
      <Industry>Motorsport</Industry>
      <Employername>Williams Racing</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.williamsf1.com.png</Employerlogo>
      <Employerdescription>Williams Racing is a British Formula One racing team that designs, manufactures, and races Formula One cars. The team is based in Grove, Oxfordshire.</Employerdescription>
      <Employerwebsite>https://careers.williamsf1.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.williamsf1.com/job/executive-office-coordinator-in-grove-wantage-jid-491</Applyto>
      <Location>Grove</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>1e66d068-858</externalid>
      <Title>Strategic Finance, Hardware R&amp;D Finance Manager</Title>
      <Description><![CDATA[<p><strong>Compensation</strong></p>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p><strong>About the Team</strong></p>
<p>The Strategic Finance team provides financial insights and guidance to support the organization&#39;s long-term goals and strategies. We partner across the business to allocate and deploy our resources for the highest impact outcomes.</p>
<p><strong>About the Role</strong></p>
<p>We are hiring a Hardware Finance Manager, R&amp;D to own the financial strategy, investment modeling, and long-range planning for our hardware R&amp;D programs. This role sits at the intersection of Hardware Engineering, Product, and Finance, and is responsible for ensuring that R&amp;D roadmap decisions—spanning technical scope, sequencing, resourcing, and timelines—are grounded in rigorous financial analysis and disciplined capital allocation.</p>
<p>This role will serve as the embedded finance partner to Hardware Engineering and Product teams, with end-to-end ownership of R&amp;D investment models, program-level financials, and long-range planning across the hardware development lifecycle. You will help leadership evaluate tradeoffs across technical ambition, speed, risk, and capital efficiency as hardware programs scale in scope and complexity.</p>
<p>We have a strong preference for candidates who can be based in our San Francisco HQ. We use a hybrid work model of 3 days in the office per week and offer relocation assistance to new employees.</p>
<p><strong>In this role, you will oversee multiple areas of responsibility:</strong></p>
<ul>
<li>Own end-to-end hardware R&amp;D financial models across the development lifecycle, from early concept and prototyping through development, validation, and transition to production.</li>
<li>Serve as a key finance partner to Hardware Engineering and Product leadership, supporting decisions on roadmap sequencing, technical scope, resourcing levels, and investment pacing through clear financial frameworks and scenario analysis.</li>
<li>Translate technical roadmaps into financially grounded R&amp;D execution plans, integrating assumptions around headcount ramps, staffing mix, tooling, lab and test infrastructure, and development timelines.</li>
<li>Drive capital discipline and investment efficiency, proactively identifying scope changes, resourcing inefficiencies, and investment risks before spend becomes structurally locked in.</li>
<li>Own R&amp;D forecasting, budgeting, and variance analysis, providing clear visibility into spend vs. plan, key drivers of change, and implications for broader hardware investment priorities.</li>
<li>Frame program-level tradeoffs and decision scenarios for leadership, clearly quantifying implications across cost, schedule, technical risk, and long-term platform value.</li>
<li>Build and maintain standardized R&amp;D investment dashboards and program views to provide consistent, executive-ready visibility into burn rates, milestone progress, and capital allocation.</li>
<li>Support portfolio-level decision-making by comparing investment profiles across hardware programs and generations, informing prioritization, sequencing, and long-term R&amp;D strategy.</li>
<li>Contribute to the development and scaling of the hardware R&amp;D finance foundation, improving modeling rigor, governance, and decision support as the hardware portfolio grows.</li>
</ul>
<p><strong>You might thrive in this role if you have:</strong></p>
<ul>
<li>8+ years of progressive finance experience with significant exposure to hardware, manufacturing, or complex supply chain businesses.</li>
<li>A passion for helping build world-class finance teams and driving business and financial outcomes, as measured by margin improvement, working capital efficiency, forecast accuracy, and execution of cost-reduction initiatives.</li>
<li>A strong ability to critically evaluate opportunities and risks.</li>
<li>Expert modeling skills with best-in-class attention to detail and unwavering commitment to accuracy.</li>
<li>Exemplary ability to distill complex financial information into actionable insights.</li>
<li>Excellent communication skills and storytelling ability when presenting data insights.</li>
<li>Strong enthusiasm for building the human-computer interface for the AI era.</li>
</ul>
<p><strong>About OpenAI</strong></p>
<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full spectrum of humanity.</p>
<p>We are an equal opportunity employer, and we do not discriminate on the basis of race, religion, color, national origin, sex, sexual orientation, age, veteran status, disability, genetic information, or other applicable legally protected characteristic.</p>
<p>For additional information, please see <a href="https://cdn.openai.com/policies/eeo-policy-statement.pdf">OpenAI’s Affirmative Action and Equal Employment Opportunity Policy Statement</a>.</p>
<p>Background checks for applicants will be administered in accordance with applicable law, and qualified applicants with arrest or conviction records will be considered for employment consistent with those laws, including the San Francisco Fair Chance Ordinance, the Los Angeles County Fair Chance Ordinance for Employers, and the California Fair Chance Act, for US-based candidates. For unincorporated Los Angeles County workers: we reasonably believe that criminal history may have a direct, adverse and negative relationship with the following job duties, potentially resulting in the withdrawal of a conditional offer of employment: protect computer hardware entrusted to you from theft, loss or damage; return all computer hardware in your possession (including the data contained therein) upon termination of employment or end of assignment; and maintain the confidentiality of proprietary, confidential, and non-public information. In addition, job duties require access to secure and protected information.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$234K – $325K</Salaryrange>
      <Skills>financial modeling, investment analysis, capital allocation, financial planning, forecasting, budgeting, variance analysis, financial reporting, financial analysis, data analysis, data visualization, financial software, Microsoft Excel, financial planning and analysis, financial modeling and analysis, financial reporting and analysis, data science, machine learning, artificial intelligence, cloud computing, data engineering, data architecture, data governance, data quality, data security, data analytics</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/9548776c-d623-4a63-af2c-c3a4a1d9685f</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>99247ee9-652</externalid>
      <Title>Data Scientist</Title>
      <Description><![CDATA[<p>We are seeking a highly skilled Data Scientist to join our team. As a Data Scientist, you will play a key role in helping us to make data-driven decisions and drive business growth.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Work closely with the data engineering team to design, develop and maintain data pipelines and architectures</li>
<li>Collaborate with cross-functional teams to identify business opportunities and develop data-driven solutions</li>
<li>Develop and maintain machine learning models to drive business growth and improve customer experience</li>
<li>Analyse large datasets to identify trends and insights that can inform business decisions</li>
<li>Communicate complex data insights to non-technical stakeholders</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Bachelor&#39;s degree in Computer Science, Mathematics, Statistics or a related field</li>
<li>2+ years of experience in data science or a related field</li>
<li>Strong programming skills in languages such as Python, R or SQL</li>
<li>Experience with machine learning libraries such as scikit-learn or TensorFlow</li>
<li>Strong analytical and problem-solving skills</li>
<li>Excellent communication and interpersonal skills</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunity to work with a leading Formula One team</li>
<li>Collaborative and dynamic work environment</li>
<li>Professional development and growth opportunities</li>
<li>Access to cutting-edge technology and tools</li>
<li>Flexible working hours and remote work options</li>
</ul>
<p>If you are a motivated and talented Data Scientist looking for a new challenge, please submit your application. We look forward to hearing from you!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, R, SQL, scikit-learn, TensorFlow, machine learning, data engineering, data pipelines, data architectures, data visualisation, data storytelling, data communication</Skills>
      <Category>Engineering</Category>
      <Industry>Motorsport</Industry>
      <Employername>Williams Racing</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.williamsf1.com.png</Employerlogo>
      <Employerdescription>Williams Racing is a British Formula One racing team. The team is one of the most successful and recognised teams in the history of the sport, with a rich heritage dating back to 1977.</Employerdescription>
      <Employerwebsite>https://careers.williamsf1.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.williamsf1.com/job/simulation-delivery-manager-in-grove-wantage-jid-513</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-07</Postedate>
    </job>
    <job>
      <externalid>c0ccd7e3-4cb</externalid>
      <Title>Data Scientist, Preparedness</Title>
      <Description><![CDATA[<p><strong>Job Posting</strong></p>
<p><strong>Data Scientist, Preparedness</strong></p>
<p><strong>Location</strong></p>
<p>San Francisco</p>
<p><strong>Employment Type</strong></p>
<p>Full time</p>
<p><strong>Department</strong></p>
<p>Data Science</p>
<p><strong>Compensation</strong></p>
<ul>
<li>$347K – $400K • Offers Equity</li>
</ul>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p>More details about our benefits are available to candidates during the hiring process.</p>
<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components at any time based on individual performance, team or company results, or market conditions.</p>
<p><strong>About the Team</strong></p>
<p>The Preparedness team is an important part of the Safety Systems org at OpenAI, and is guided by OpenAI’s Preparedness Framework.</p>
<p>Frontier AI models have the potential to benefit all of humanity, but also pose increasingly severe risks. To ensure that AI promotes positive change, the Preparedness team helps us prepare for the development of increasingly capable frontier AI models. This team is tasked with identifying, tracking, and preparing for catastrophic risks related to frontier AI models.</p>
<p>The mission of the Preparedness team is to:</p>
<ol>
<li>Closely monitor and predict the evolving capabilities of frontier AI systems, with an eye towards misuse risks whose impact could be catastrophic to our society</li>
<li>Ensure we have concrete procedures, infrastructure and partnerships to mitigate these risks and to safely handle the development of powerful AI systems</li>
</ol>
<p>Preparedness tightly connects capability assessment, evaluations, internal red teaming, and mitigations for frontier models, as well as overall coordination on AGI preparedness. This is fast-paced, exciting work with far-reaching importance for the company and for society.</p>
<p><strong>About the Role</strong></p>
<p>We’re hiring a Data Scientist to help build, evaluate, and continuously improve mitigations that prevent extreme harms from AI systems. This role is for an experienced, highly autonomous individual contributor who can take ambiguous problem statements, structure rigorous analyses, and translate findings into actionable product and policy changes.</p>
<p>This position goes beyond “running evals.” You’ll help create mitigation intelligence and monitoring systems that enable OpenAI to detect issues early, measure effectiveness over time, and reduce both over-blocking (unnecessary friction) and under-blocking (missed harm).</p>
<p><strong>What You’ll Do</strong></p>
<ul>
<li>Evaluate and improve mitigation systems, including classifiers and detection pipelines across domains (e.g., biosecurity, cybersecurity, and emerging risk areas).</li>
<li>Diagnose false positives and false negatives with deep error analysis, root cause investigation, and clear recommendations for mitigation adjustments.</li>
<li>Build monitoring and measurement frameworks to track mitigation effectiveness over time and across user segments and use cases.</li>
<li>Identify trends in over-blocking vs. under-blocking, quantify customer impact, and propose prioritized interventions.</li>
<li>Develop insights from customer feedback, complaints, and usage patterns to detect shifts in adversarial behavior and system failure modes.</li>
<li>Expand risk monitoring into new areas, including cybersecurity threats and model loss-of-control or sabotage scenarios, in partnership with domain experts.</li>
<li>Communicate results to technical and executive stakeholders with crisp narratives, decision-ready metrics, and clear tradeoffs.</li>
</ul>
<p><strong>You might thrive in this role if you are:</strong></p>
<ul>
<li>An autonomous operator: you can take a problem statement and independently structure the analysis end-to-end.</li>
<li>Strong at executive-ready communication: concise, clear, and outcome-oriented.</li>
<li>Skilled in turning analysis into product changes: you’re comfortable influencing across functions to drive mitigation improvements.</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>Significant experience in data science or applied analytics in high-stakes domains (e.g., security, trust &amp; safety, abuse prevention, fraud, platform integrity, or reliability).</li>
<li>Strong foundations in experimentation, causal thinking, and/or observational inference; ability to design robust measurement under imperfect data.</li>
<li>Fluency in SQL and Python (or equivalent) for analysis, modeling, and building monitoring workflows.</li>
<li>Experience building metrics, dashboards, and operational monitoring that meaningfully change outcomes (not just reporting).</li>
<li>Track record of driving cross-functional impact with engineering, product, and research partners.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$347K – $400K • Offers Equity</Salaryrange>
      <Skills>data science, applied analytics, security, trust &amp; safety, abuse prevention, fraud, platform integrity, reliability, SQL, Python, experimentation, causal thinking, observational inference, measurement, metrics, dashboards, operational monitoring, machine learning, deep learning, natural language processing, computer vision, data engineering, data architecture</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is a technology company that focuses on developing and applying artificial intelligence in a way that benefits humanity. It was founded in 2015 and has since grown to become one of the leading AI research and development companies in the world.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/efcc3430-14c8-4022-8350-8146ffb867ab</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>448a56f3-ab5</externalid>
      <Title>Director of Data Engineering and Agentic AI Automation, Finance</Title>
      <Description><![CDATA[<p><strong>Director of Data Engineering and Agentic AI Automation, Finance</strong></p>
<p><strong>Location</strong></p>
<p>San Francisco</p>
<p><strong>Employment Type</strong></p>
<p>Full time</p>
<p><strong>Department</strong></p>
<p>Finance</p>
<p><strong>Compensation</strong></p>
<ul>
<li>$347K – $490K • Offers Equity</li>
</ul>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<p><strong>Benefits</strong></p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p><strong>About the Team</strong></p>
<p>We are looking for a Director of Data Engineering and Agentic AI Automation to lead the next generation of our finance data infrastructure. As OpenAI expands its Finance operations, we need scalable and trustworthy data systems to match the pace and complexity of our growth. This includes well-modeled, auditable data for revenue recognition, financial reporting, and planning, supported by reliable pipelines that connect ERP, planning, and operational systems. You will lead a group of analytics engineers, data engineers, and AI engineers to build the data pipelines that connect our internal engineering systems with enterprise platforms such as Oracle Fusion ERP. This role will also define the roadmap for agentic AI automation, enabling intelligent workflows, process automation, and AI-driven decision-making across Finance.</p>
<p><strong>In this role, you will:</strong></p>
<ul>
<li>Build and maintain scalable, auditable data infrastructure that powers accurate financial information, with a focus on revenue recognition, compute attribution, and close automation.</li>
<li>Lead and grow teams of analytics engineers, data engineers, and AI engineers to deliver high-impact, intelligent data systems.</li>
<li>Guide work across financial close and allocations automation, B2C revenue automation from engineering systems to ERP (including reconciliation with cash and source systems), and other mission-critical financial processes.</li>
<li>Design and implement data pipelines connecting ERP, planning, and operational systems, including Oracle Fusion, Anaplan, and Workday.</li>
<li>Build and support scalable, audit-proof architecture that enables reliable financial reporting and compliance.</li>
<li>Develop data and AI-powered workflows that enhance forecasting accuracy, compliance automation, and operational efficiency.</li>
<li>Create and maintain data marts and products that support stakeholders across Revenue, FP&amp;A, Tax, Procurement, Hardware Accounting, and Controller teams.</li>
<li>Define and enforce best practices for data modeling, lineage, observability, and reconciliation across finance data domains.</li>
<li>Set the technical direction and manage team structure, mentoring engineers and overseeing contractors or system integrators to ensure delivery of high-quality outcomes.</li>
<li>Partner with senior leaders across Finance, Engineering, and Infrastructure to align on priorities and integrate new automation capabilities.</li>
<li>Ensure data systems are AI-ready and capable of supporting predictive analytics, autonomous agent workflows, and large-scale automation.</li>
<li>Own and maintain Tier-1 data pipelines with strict SLA, data quality, and compliance standards.</li>
<li>Drive the long-term roadmap for agentic AI enablement to build the foundation for “Finance on OpenAI.”</li>
</ul>
<p><strong>You might thrive in this role if you have:</strong></p>
<ul>
<li>12+ years in data engineering, with proven experience building and managing enterprise-scale, auditable ETL pipelines and complex datasets</li>
<li>Proficiency in SQL and Python, with demonstrated experience in schema design, data modeling, and orchestration frameworks</li>
<li>Expertise in distributed data processing technologies such as Apache Spark, Kafka, and cloud-native storage (e.g., S3, ADLS)</li>
<li>Deep knowledge of enterprise data architecture, especially within Finance and Supply Chain</li>
<li>Familiarity with financial processes (close, allocations, revenue recognition) and supply chain data models (supply and demand planning, procurement, vendor master), along with experience ingesting high-volume B2C data from internal engineering systems</li>
<li>Experience integrating with contract manufacturers and external logistics providers is a strong plus</li>
<li>Strong track record of partnering with senior business stakeholders</li>
</ul>
<p><strong>Work Environment</strong></p>
<p>This role is based in San Francisco, CA. We use a hybrid work model with three days per week in the office, and we offer relocation assistance to new employees.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$347K – $490K • Offers Equity</Salaryrange>
      <Skills>SQL, Python, Apache Spark, Kafka, cloud-native storage (S3, ADLS), data modeling, schema design, orchestration frameworks, distributed data processing, enterprise data architecture, ETL pipelines, data infrastructure, revenue recognition, financial reporting, financial close and allocations automation, ERP integration (Oracle Fusion), Anaplan, Workday, data marts, data lineage, observability, reconciliation, predictive analytics, autonomous agent workflows, large-scale automation</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is a technology company that specializes in artificial intelligence. It was founded in 2015 and is headquartered in San Francisco, California.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/e84e7b7e-a82e-411e-929a-615dc3080280</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>92fce934-b53</externalid>
      <Title>Data Scientist</Title>
      <Description><![CDATA[<p><strong>Job Description</strong></p>
<p>We are seeking a highly skilled Data Scientist to join our team. As a Data Scientist, you will be responsible for analysing large datasets to gain insights and inform our racing strategy.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Develop and implement data analysis and machine learning models to improve our racing performance</li>
<li>Work closely with our engineering and racing teams to understand their needs and develop solutions</li>
<li>Collaborate with our data engineers to design and implement data pipelines and architectures</li>
<li>Develop and maintain data visualisation tools to communicate insights to our teams</li>
<li>Stay up-to-date with the latest developments in data science and machine learning</li>
<li>Work with our data engineers to ensure data quality and integrity</li>
<li>Develop and maintain data documentation and standards</li>
<li>Collaborate with our racing teams to develop and implement data-driven strategies</li>
<li>Work with our data engineers to develop and implement data-driven decision-making tools</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Bachelor&#39;s degree in Computer Science, Mathematics, Statistics, or a related field</li>
<li>2+ years of experience in data science or a related field</li>
<li>Strong programming skills in Python, R, or SQL</li>
<li>Experience with machine learning libraries such as scikit-learn, TensorFlow, or PyTorch</li>
<li>Experience with data visualisation tools such as Matplotlib, Seaborn, or Plotly</li>
<li>Strong understanding of statistical concepts and techniques</li>
<li>Experience with data engineering and data architecture</li>
<li>Strong communication and collaboration skills</li>
<li>Ability to work in a fast-paced environment</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunity to work with a professional motorsport organisation</li>
<li>Collaborative and dynamic work environment</li>
<li>Opportunities for professional growth and development</li>
<li>Access to cutting-edge technology and tools</li>
<li>Flexible working hours and remote work options</li>
</ul>
<p><strong>How to Apply</strong></p>
<p>If you are a motivated and talented Data Scientist looking for a new challenge, please submit your application, including your resume and a cover letter, to [insert contact information]. We look forward to hearing from you!</p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>Competitive salary and benefits package</Salaryrange>
      <Skills>Python, R, SQL, Machine learning, Data visualisation, Statistical concepts, Data engineering, Data architecture, TensorFlow, PyTorch, Matplotlib, Seaborn, Plotly</Skills>
      <Category>Engineering</Category>
      <Industry>Motorsport</Industry>
      <Employername>W Racing Team</Employername>
      <Employerlogo>https://logos.yubhub.co/w-racingteam.com.png</Employerlogo>
      <Employerdescription>W Racing Team is a professional motorsport organisation that competes in various international racing series. The team has a strong presence in the FIA World Endurance Championship and the IMSA WeatherTech SportsCar Championship.</Employerdescription>
      <Employerwebsite>https://www.w-racingteam.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://www.w-racingteam.com/manufacturing/careers/mécano</Applyto>
      <Location>Monza</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>4a7597fd-d7a</externalid>
      <Title>Senior Data Engineer</Title>
      <Description><![CDATA[<p>Joining Razer will place you on a global mission to revolutionize the way the world games. Razer is a place to do great work, offering you the opportunity to make a global impact while working with a team located across 5 continents. Razer is also a great place to work, providing the unique, gamer-centric #LifeAtRazer experience that will accelerate your growth, both personally and professionally.</p>
<p><strong>What you&#39;ll do</strong></p>
<p>We are looking for a Senior Data Engineer to lead the technical initiatives for AI Data Engineering, enabling scalable, high-performance data pipelines that power AI and machine learning applications. This role will focus on architecting, optimizing, and managing data infrastructure to support AI model training, feature engineering, and real-time inference. You will collaborate closely with AI/ML engineers, data scientists, and platform teams to build the next generation of AI-driven products.</p>
<ul>
<li>Lead AI Data Engineering initiatives by driving the design and development of robust data pipelines for AI/ML workloads, ensuring efficiency, scalability, and reliability.</li>
<li>Design and implement data architectures that support AI model training, including feature stores, vector databases, and real-time streaming solutions.</li>
<li>Develop high-performance data pipelines that process structured, semi-structured, and unstructured data at scale, supporting a wide range of AI applications.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>Hands-on experience working with vector/graph databases (e.g., Neo4j)</li>
<li>3+ years of experience in data engineering, working on AI/ML-driven data architectures</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Vector/graph databases (Neo4j), data engineering, AI/ML-driven data architectures, Python, SQL, cloud infrastructure (AWS, Azure, Google Cloud Platform), Infrastructure as Code (Terraform), containerization (Docker), container orchestration (Kubernetes), workflow orchestration (Airflow, Prefect), distributed computing (Spark, Dask), data transformation (dbt), streaming and batch data processing, data storage management (Data Lake, Lakehouse, SQL and NoSQL databases)</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Razer</Employername>
      <Employerlogo>https://logos.yubhub.co/razer.com.png</Employerlogo>
      <Employerdescription>Razer is a global company that creates cutting-edge products and experiences that define the ultimate gameplay. They are guided by their mission &apos;For Gamers. By Gamers.&apos; and are relentlessly pushing boundaries and leading the charge in AI for gaming, shaping the future of the industry.</Employerdescription>
      <Employerwebsite>https://razer.wd3.myworkdayjobs.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://razer.wd3.myworkdayjobs.com/en-US/Careers/job/Singapore/Senior-Data-Engineer_JR2025005485</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-01-01</Postedate>
    </job>
    <job>
      <externalid>e5eb908e-6f9</externalid>
      <Title>Senior Data Engineer</Title>
      <Description><![CDATA[<p>We are looking for a Senior Data Engineer to lead the technical initiatives for AI Data Engineering, enabling scalable, high-performance data pipelines that power AI and machine learning applications. This role will focus on architecting, optimizing, and managing data infrastructure to support AI model training, feature engineering, and real-time inference.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Lead AI Data Engineering initiatives by driving the design and development of robust data pipelines for AI/ML workloads, ensuring efficiency, scalability, and reliability.</li>
<li>Design and implement data architectures that support AI model training, including feature stores, vector databases, and real-time streaming solutions.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>Hands-on experience working with vector/graph databases (e.g., Neo4j)</li>
<li>3+ years of experience in data engineering, working on AI/ML-driven data architectures</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Vector/graph databases (Neo4j), data engineering, AI/ML-driven data architectures, Python, SQL, Terraform, Docker, Kubernetes, Airflow, Prefect, Spark, Dask, dbt</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Razer</Employername>
      <Employerlogo>https://logos.yubhub.co/razer.com.png</Employerlogo>
      <Employerdescription>Razer is a global leader in the gaming industry, dedicated to creating cutting-edge products and experiences that define the ultimate gameplay. With a mission to revolutionize the way the world games, Razer is a place to do great work, offering opportunities to make an impact globally while working across a global team located across 5 continents.</Employerdescription>
      <Employerwebsite>https://razer.wd3.myworkdayjobs.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://razer.wd3.myworkdayjobs.com/en-US/Careers/job/Singapore/Senior-Data-Engineer_JR2025005485</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2025-12-26</Postedate>
    </job>
  </jobs>
</source>