<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>0da192c7-2d3</externalid>
      <Title>Planning Analytics Developer</Title>
      <Description><![CDATA[<p>As a Planning Analytics Developer at MHP, you will design, develop, and enhance IBM Planning Analytics Workspace (PAW) dashboards, books, reports, and interactive planning applications. You will build and optimize Planning Analytics for Excel (PAX) reports, input templates, and management reporting books to support financial and operational planning processes. You will also develop and maintain TurboIntegrator (TI) processes for data loading, data transformation, master data maintenance, and automation workflows. Additionally, you will create, enhance, and troubleshoot TM1 Rules and Feeders to ensure performant and scalable cube calculations. You will collaborate with business stakeholders to translate planning and reporting requirements into user-friendly PAW/PAX solutions. You will support backend model improvements, including dimension design, cube structures, security concepts, and performance optimization. You will participate in issue resolution, knowledge sharing, and continuous improvements of the Planning Analytics landscape.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Design, develop, and enhance PAW dashboards, books, reports, and interactive planning applications</li>
<li>Build and optimize PAX reports, input templates, and management reporting books</li>
<li>Develop and maintain TI processes for data loading, data transformation, master data maintenance, and automation workflows</li>
<li>Create, enhance, and troubleshoot TM1 Rules and Feeders</li>
<li>Collaborate with business stakeholders to translate planning and reporting requirements into user-friendly PAW/PAX solutions</li>
<li>Support backend model improvements, including dimension design, cube structures, security concepts, and performance optimization</li>
<li>Participate in issue resolution, knowledge sharing, and continuous improvements of the Planning Analytics landscape</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>Strong hands-on experience with IBM Planning Analytics Workspace (PAW) development</li>
<li>Extensive experience with IBM Planning Analytics for Excel (PAX) reporting and input templates</li>
<li>Solid backend expertise in IBM TM1 / IBM Planning Analytics</li>
<li>Strong knowledge of TurboIntegrator (TI) scripting</li>
<li>Practical experience with Rules, Feeders, and cube performance tuning</li>
<li>Good understanding of dimension hierarchies, subsets, MDX, and cube design</li>
<li>Experience with data integration, file-based loads, and process automation</li>
<li>Experience with PowerShell / batch scripting for file automation (nice-to-have)</li>
<li>Ability to work on both frontend user experience and backend technical logic</li>
<li>Strong attention to detail</li>
<li>Strong communication skills with both technical and business stakeholders</li>
<li>Ability to translate business requirements into intuitive planning solutions</li>
<li>High level of ownership and autonomy</li>
<li>Analytical and problem-solving mindset</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>Competitive salary and benefits package</Salaryrange>
      <Skills>IBM Planning Analytics Workspace (PAW), IBM Planning Analytics for Excel (PAX), TurboIntegrator (TI), TM1 Rules and Feeders, Dimension design, Cube structures, Security concepts, Performance optimization, PowerShell / batch scripting, Data integration, File-based loads, Process automation</Skills>
      <Category>IT</Category>
      <Industry>Consulting</Industry>
      <Employername>MHP</Employername>
      <Employerlogo>https://logos.yubhub.co/mhp.com.png</Employerlogo>
      <Employerdescription>MHP is a technology and business partner that digitizes its customers&apos; processes and products, supporting them in their IT transformations along the entire value chain. It serves over 300 customers worldwide, including leading corporations and innovative medium-sized companies.</Employerdescription>
      <Employerwebsite>https://www.mhp.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.porsche.com/index.php?ac=jobad&amp;id=20293</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-04-22</Postedate>
    </job>
    <job>
      <externalid>ac5d007b-341</externalid>
      <Title>Senior Finance and Planning Analyst</Title>
      <Description><![CDATA[<p>Join our team as a Senior Finance and Planning Analyst and contribute to the development of financial and planning processes for our customers. As a key member of our team, you will work closely with various stakeholders to analyze customer requirements and translate them into technical specifications. We are looking for a highly skilled and experienced professional with expertise in IBM Planning Analytics (TM1) and a strong background in finance and planning.</p>
<p>Your main responsibilities will include:</p>
<ul>
<li>Developing and implementing frontend solutions in Planning Analytics Workspace (PAW) and Planning Analytics for Excel (PAfE) to support data-driven decision-making</li>
<li>Building and maintaining backend components such as TM1 models, cubes, rules, and TurboIntegrator processes</li>
<li>Developing and integrating data pipelines and interfaces (ETL, APIs, middleware) and connecting to databases (e.g., SQL Server, Oracle)</li>
<li>Collaborating with different stakeholders (customer, project team, departments) to take up and translate customer requirements into technical specifications</li>
</ul>
<p>As a Senior Finance and Planning Analyst, you will have the opportunity to work on challenging and international projects and contribute to the growth and success of our organization.</p>
<p>Key qualifications:</p>
<ul>
<li>Bachelor&#39;s degree and at least 2 years of experience with IBM Planning Analytics with a focus on frontend and backend development</li>
<li>Passion for TM1 architectures, components, deployment strategies, and performance tuning for large-scale applications</li>
<li>Expertise in developing Planning Analytics solutions - from frontend (PAW, UX/UI, dashboard design, HTML, CSS, JavaScript, React/Angular) to backend (TM1, data integration via ETL/APIs/middleware, databases like SQL Server/Oracle, and scripting with TurboIntegrator, rules, SQL, TM1py)</li>
<li>Analytical thinking, pragmatic approach, teamwork, and customer orientation</li>
</ul>
<p>Additional information:</p>
<ul>
<li>Start date: to be agreed upon - always at the beginning of a month</li>
<li>Working hours: full-time (40 hours) &amp; 30 days of annual leave</li>
<li>Employment contract: unlimited</li>
<li>Field: consulting</li>
<li>Language: confident German and English</li>
<li>Flexibility and willingness to travel</li>
<li>Other: a valid work permit is required; if necessary, we can apply for a work permit within our recruitment process. The procedure takes time and affects the start date</li>
</ul>
<p>At MHP, you will grow continuously in an innovative and supportive environment. We are the perfect sparring partner for your career. For professional input and networking opportunities, we offer:</p>
<ul>
<li>Appreciation: we support and appreciate colleagues as they are and celebrate our successes together</li>
<li>Creativity: we welcome creativity and new impulses</li>
<li>Flexibility: time-wise and location-wise - depending on the project, at home, in the office, or at the customer&#39;s site</li>
<li>Growth opportunities: with us, you have the chance to grow in tasks, knowledge, and responsibility</li>
</ul>
<p>Please submit your application as soon as possible. You can easily send your application documents, such as your resume, certificates, and possibly project lists, in just a few clicks through our Job Locator. A cover letter is not required.</p>
<p>By the way: when your application reaches us, our recruiting team checks whether there is a suitable position for you. Independently of current job postings, we try to find the right job for you at MHP.</p>
<p>If you have any questions, please check our FAQs on our Career page. For further inquiries, please contact our Recruitment Team at +49 (0)7141 7856 1600.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>€95,000 - €110,000 per year</Salaryrange>
      <Skills>IBM Planning Analytics, TM1, Frontend development, Backend development, Data integration, ETL, APIs, Middleware, Database management, SQL Server, Oracle, TurboIntegrator, Rules, SQL, TM1py</Skills>
      <Category>Finance</Category>
      <Industry>Consulting</Industry>
      <Employername>MHP</Employername>
      <Employerlogo>https://logos.yubhub.co/mhp.com.png</Employerlogo>
      <Employerdescription>MHP is a technology and business partner that digitalizes processes and products for its customers and accompanies them in their IT transformations along the entire value chain.</Employerdescription>
      <Employerwebsite>https://www.mhp.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.porsche.com/index.php?ac=jobad&amp;id=20110</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-04-22</Postedate>
    </job>
    <job>
      <externalid>7275ef33-009</externalid>
      <Title>Staff Data Engineer</Title>
      <Description><![CDATA[<p>At Bayer, we&#39;re seeking a Staff Data Engineer to join our team. As a Staff Data Engineer, you will design and lead the implementation of data flows to connect operational systems, data for analytics and business intelligence (BI) systems. You will recognize opportunities to reuse existing data flows, lead the build of data streaming systems, optimize the code to ensure processes perform optimally, and lead work on database management.</p>
<p>Communicating Between Technical and Non-Technical Colleagues</p>
<p>As a Staff Data Engineer, you will communicate effectively with technical and non-technical stakeholders, support and host discussions within a multidisciplinary team, and be an advocate for the team externally.</p>
<p>Data Analysis and Synthesis</p>
<p>You will undertake data profiling and source system analysis, and present clear insights to colleagues to support the end use of the data.</p>
<p>Data Development Process</p>
<p>You will design, build, and test data products that are complex or large scale, and build teams to complete data integration services.</p>
<p>Data Innovation</p>
<p>You will understand the impact on the organization of emerging trends in data tools, analysis techniques and data usage.</p>
<p>Data Integration Design</p>
<p>You will select and implement the appropriate technologies to deliver resilient, scalable and future-proofed data solutions and integration pipelines.</p>
<p>Data Modeling</p>
<p>You will produce relevant data models across multiple subject areas, explain which models to use for which purpose, understand industry-recognised data modelling patterns and standards and when to apply them, and compare and align different data models.</p>
<p>Metadata Management</p>
<p>You will design an appropriate metadata repository and present changes to existing metadata repositories, understand a range of tools for storing and working with metadata, and provide oversight and advice to less experienced members of the team.</p>
<p>Problem Resolution</p>
<p>You will respond to problems in databases, data processes, data products, and services as they occur, initiate actions, monitor services, and identify trends to resolve problems, determining the appropriate remedy and assisting with its implementation and with preventative measures.</p>
<p>Programming and Build</p>
<p>You will use agreed standards and tools to design, code, test, correct and document moderate-to-complex programs and scripts from agreed specifications and subsequent iterations, collaborate with others to review specifications where appropriate.</p>
<p>Technical Understanding</p>
<p>You will understand the core technical concepts related to the role, and apply them with guidance.</p>
<p>Testing</p>
<p>You will review requirements and specifications, define test conditions, identify issues and risks associated with the work, and analyse and report on test activities and results.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$114,400 to $171,600</Salaryrange>
      <Skills>Proficiency in a programming language such as Python or Java, Experience with Big Data technologies such as Hadoop, Spark, and Kafka, Familiarity with ETL processes and tools, Knowledge of SQL and NoSQL databases, Strong understanding of relational databases, Experience with data warehousing solutions, Proficiency with cloud platforms, Expertise in data modeling and design, Experience in designing and building scalable data pipelines, Experience with RESTful APIs and data integration, Relevant certifications (e.g., GCP Certified, AWS Certified, Azure Certified), Bachelor&apos;s degree in Computer Science, Data Engineering, Information Technology, or a related field, Strong analytical and communication skills, Ability to work collaboratively in a team environment, High level of accuracy and attention to detail</Skills>
      <Category>Engineering</Category>
      <Industry>Healthcare</Industry>
      <Employername>Bayer</Employername>
      <Employerlogo>https://logos.yubhub.co/talent.bayer.com.png</Employerlogo>
      <Employerdescription>Bayer is a multinational pharmaceutical and life sciences company that develops and manufactures a wide range of healthcare products.</Employerdescription>
      <Employerwebsite>https://talent.bayer.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://talent.bayer.com/careers/job/562949976928777</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>1acac706-1e4</externalid>
      <Title>Senior Engineering Manager, Acquisition</Title>
      <Description><![CDATA[<p>Job Title: Senior Engineering Manager, Acquisition</p>
<p>Join Brex, the intelligent finance platform that enables companies to spend smarter and move faster in over 200 markets. As a Senior Engineering Manager, Acquisition, you will lead the engineering group responsible for driving Brex&#39;s growth engine.</p>
<p>About Brex</p>
<p>Brex is a financial technology company that provides corporate cards and banking services to businesses. With a strong focus on innovation and customer satisfaction, Brex has become a leading player in the fintech industry.</p>
<p>Responsibilities</p>
<ul>
<li>Lead and mentor high-performing teams of both product engineers and Salesforce admins and developers, fostering their career development through coaching, feedback, and hands-on guidance.</li>
<li>Drive the architectural vision, technical roadmap, and project execution for our Salesforce platform, GTM applications, and Marketing website (Brex.com), ensuring scalability, performance, and security.</li>
<li>Champion and integrate AI-native solutions within your teams&#39; Sales, Marketing, Operations, and CX systems and workflows to drive efficiency and unlock new capabilities.</li>
<li>Partner with stakeholders across Sales, Marketing, Operations, and CX, acting as a strategic advisor to translate business needs into a prioritized engineering backlog while being jointly accountable to business metrics such as CAC, payback, and rep capacity.</li>
<li>Own the operational excellence of your group, managing sprint capacity, removing blockers, and ensuring high-velocity, high-quality delivery.</li>
<li>Establish and enforce engineering best practices for Sales and Marketing applications and the Salesforce ecosystem, including advanced DevOps (CI/CD, source control, release management), code quality, and system governance.</li>
<li>Design and oversee the implementation of complex integrations and automation across our GTM systems and third-party applications, including owning build vs buy decisions.</li>
</ul>
<p>Requirements</p>
<ul>
<li>Bachelor&#39;s or Master&#39;s degree in Computer Science, Engineering, or a related field.</li>
<li>8+ years of software engineering experience with a strong technical background, with 3+ years of experience managing or leading multiple technical teams in a high-growth environment.</li>
<li>3+ years of progressive experience in Salesforce development and architecture.</li>
<li>Deep expertise in the Salesforce platform, including Apex, Lightning Web Components, SOQL, and platform APIs.</li>
<li>Experience architecting and building robust integrations between Salesforce and a broader ecosystem of GTM tools and third-party applications.</li>
<li>Excellent interpersonal and relationship-building skills with the ability to manage and communicate effectively with XFN partners (e.g. Sales, Marketing, CX, and Operations) at all levels, from individual contributors to senior leadership.</li>
<li>A strategic mindset with the ability to balance long-term platform health with immediate business needs, and a proven ability to prioritize and push back when work is not aligned with company goals.</li>
<li>A passion for leveraging AI to improve both team productivity and the products you build.</li>
</ul>
<p>Bonus Points</p>
<ul>
<li>Salesforce certifications (e.g., Platform Developer II, Application/System Architect).</li>
<li>Experience managing remote or distributed engineering teams.</li>
<li>Former founders welcome.</li>
<li>Experience building Slack applications and integrations.</li>
<li>Experience with B2B growth.</li>
<li>You have started your own technology venture or were a foundational engineering member of an early-stage start up. We value entrepreneurial spirit &amp; scrappiness!</li>
<li>You are a champion for the customer and constantly put yourself in the shoes of your users, and strive to create an intuitive and delightful experience.</li>
</ul>
<p>Compensation</p>
<p>The expected salary range for this role is $300,000 - $375,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$300,000 - $375,000</Salaryrange>
      <Skills>Salesforce, Apex, Lightning Web Components, SOQL, Platform APIs, DevOps, CI/CD, Source Control, Release Management, Code Quality, System Governance, AI, Machine Learning, Cloud Computing, Data Integration, Security</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Brex</Employername>
      <Employerlogo>https://logos.yubhub.co/brex.com.png</Employerlogo>
      <Employerdescription>Brex is a financial technology company that provides corporate cards and banking services to businesses.</Employerdescription>
      <Employerwebsite>https://brex.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/brex/jobs/8299931002</Applyto>
      <Location>Seattle, Washington, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>995b724b-c85</externalid>
      <Title>Senior Sales Engineer, Partnerships</Title>
      <Description><![CDATA[<p>We are seeking a Senior Sales Engineer, Partnerships to join our team. As a Senior Sales Engineer, you will be responsible for providing technical expertise and strategic enablement to partners, facilitating strategies to pursue avenues of revenue outside of the life sciences. This role bridges technical knowledge and business strategy, supporting partners during discovery, qualification, and solution design to showcase the value of Komodo&#39;s healthcare data and analytics platform.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Serve as a technical lead on 8-10 strategic opportunities, directly influencing deal cycles and accelerating revenue growth.</li>
<li>Become the definitive subject matter expert on Komodo&#39;s comprehensive suite of healthcare data assets and platform capabilities.</li>
<li>Garner subject matter expertise and ownership of a segment within the Partnerships / Channel Partnerships organization.</li>
<li>Develop scalable technical frameworks, demo environments, and reusable assets that set new organizational standards, with a heavy emphasis on agentic AI workflows.</li>
<li>Drive cross-functional initiatives by partnering with Product, Data Science, and Engineering to deliver customized, innovative solutions.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>7+ years of experience in Sales Engineering or Solutions Engineering with a focus on healthcare data and healthcare technology.</li>
<li>Proven track record of understanding and leveraging AI tools to enhance SaaS products or improve operational workflows.</li>
<li>Expertise in healthcare data (e.g., 837/835 transactions, NDC codes) and its practical applications in analytics, reporting, and decision-making.</li>
<li>Strong technical skills, including experience with APIs, data integration, cloud-based architectures (e.g., AWS, Azure), and analyzing large datasets.</li>
<li>An understanding of and proficiency in data science techniques, specifically SQL, Python, and/or R.</li>
<li>Excellent communication and presentation skills, with the ability to train partners and translate complex technical concepts for diverse stakeholders.</li>
</ul>
<p>Preferred Skills:</p>
<ul>
<li>Experience working within the provider, payer, or financial service segments.</li>
<li>Technical certifications in AWS, Azure, or data platforms.</li>
<li>Experience with CRM platforms like Salesforce for managing partner and client interactions.</li>
<li>Familiarity with data visualization tools (e.g., Tableau, Looker) to create impactful partner training materials.</li>
<li>Knowledge of identity resolution and privacy-preserving linking technologies.</li>
<li>Prior experience developing joint business plans and co-sell strategies with channel partners.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$143,000-$193,000 USD</Salaryrange>
      <Skills>Sales Engineering, Healthcare Data, Healthcare Technology, AI Tools, APIs, Data Integration, Cloud-Based Architectures, Data Science Techniques, SQL, Python, R, Excellent Communication, Presentation Skills, Experience Working Within Provider, Payer, or Financial Service Segments, Technical Certifications in AWS, Azure, or Data Platforms, Experience with CRM Platforms Like Salesforce, Familiarity with Data Visualization Tools, Knowledge of Identity Resolution and Privacy-Preserving Linking Technologies, Prior Experience Developing Joint Business Plans and Co-Sell Strategies</Skills>
      <Category>Sales</Category>
      <Industry>Healthcare</Industry>
      <Employername>Komodo Health</Employername>
      <Employerlogo>https://logos.yubhub.co/komodohealth.com.png</Employerlogo>
      <Employerdescription>Komodo Health is a healthcare technology company that aims to reduce the global burden of disease by providing a comprehensive view of the US healthcare system.</Employerdescription>
      <Employerwebsite>https://www.komodohealth.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/komodohealth/jobs/8495825002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>72260a13-6ce</externalid>
      <Title>Web Analytics &amp; Insights Manager</Title>
      <Description><![CDATA[<p>Join the team as Twilio&#39;s next Web Analytics &amp; Insights Manager.</p>
<p>This position is needed to partner in driving a scalable, holistic web analytics implementation strategy while maintaining the data integrity of the existing setup, ultimately ensuring Twilio.com web tracking capabilities continue to grow, are utilized by the company, and are accurate + reliable. The ideal candidate collaborates with several cross-functional teams at Twilio such as UX, CRO, Web Strategy, Web Dev, Demand Gen, SEO, Content, and more to deliver scalable and easy-to-use web analytics solutions and insights.</p>
<p>As a subject matter expert in this space, you will grow your presence company-wide and eventually own relationships cross-functionally.</p>
<p>We are looking for someone with proven experience in GA4 implementation, business/marketing analytics reporting, analysis and stakeholder communications.</p>
<p>Responsibilities:</p>
<ul>
<li>Work closely with the Web team to deliver desired web analytics tracking for business stakeholders</li>
<li>Own and maintain a holistic GA/web analytics strategy, implementation, troubleshooting, documentation + enablement</li>
<li>Brainstorm with technical and non-technical stakeholders on creative ways to meet data requirements, while balancing site performance, holistic tagging strategies, and expanded data capabilities</li>
<li>Drive continual web analytics capability improvements, and maintain high standards for data integrity</li>
<li>Own and share recurring monthly and ad-hoc analysis, deep dives, and storytelling to uncover the &#39;so what&#39; - providing actionable insights and recommendations across site, page, channel and A/B testing performance</li>
<li>Strategize and build scalable solutions that support company-wide enablement and the utilization of web analytics for data-driven decision-making</li>
<li>Partner with our CDP SMEs to orchestrate a unified data strategy, maximizing the integration between our CDP and Google Analytics to drive a 360-degree view of the customer journey</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>5 years of experience in web analytics</li>
<li>Expertise in Google Analytics 4, both implementation and analysis.</li>
<li>Strong knowledge of JavaScript, HTML, and browser dev tools.</li>
<li>Hands-on experience with Google Tag Manager (GTM)</li>
<li>Familiarity with A/B and multivariate testing frameworks, as well as web personalization</li>
<li>Strong communication skills with the ability to communicate effectively with both technical and non-technical audiences</li>
<li>Detail-oriented with a passion for clean, trustworthy data.</li>
<li>Strong analytical skills with a track record of independently exploring data, delivering insights, storytelling, and actionable recommendations.</li>
<li>Excellent collaborator across cross functional teams; skilled in managing stakeholder projects</li>
<li>Highly reliable and accountable.</li>
<li>A self-starter.</li>
<li>Interest or experience in applying AI-driven solutions to analytics workflows.</li>
</ul>
<p>Desired:</p>
<ul>
<li>Familiarity with Big Query, Adobe AEM, Adobe Target</li>
<li>Interest or exposure to fraud monitoring tools and processes.</li>
<li>Experience with Looker Studio, Tableau, or other BI tools.</li>
<li>Working knowledge of Customer Data Platforms (CDPs) and analytics data integrations.</li>
<li>A deeper understanding of:
<ul>
<li>First party and third party cookies</li>
<li>Cross domain tracking (including the limitations of web protocols and UTM strategies)</li>
<li>Tagging and tracking strategies (e.g., when to use Segment → GA4 vs. GTM → GA4 for optimal efficiency)</li>
<li>Ad pixel setup and UTM deployment strategies</li>
</ul>
</li>
</ul>
<p>Travel:</p>
<p>We prioritize connection and opportunities to build relationships with our customers and each other. For this role, approximately 15% travel is anticipated to help you connect in-person in a meaningful way.</p>
<p>What We Offer:</p>
<p>Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.</p>
<p>Compensation:</p>
<p>*Please note this role is open to candidates outside of California, Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, New Jersey, New York, Vermont, Washington D.C., and Washington State. The information below is provided for candidates hired in those locations only. The estimated pay ranges for this role are as follows:</p>
<p>Based in Colorado, Hawaii, Illinois, Maryland, Massachusetts, Minnesota, Vermont or Washington D.C. : $106,320 - $132,900. Based in New York, New Jersey, Washington State, or California (outside of the San Francisco Bay area): $112,560 - $140,700. Based in the San Francisco Bay area, California: $125,040 - $156,300.</p>
<p>This role may be eligible to participate in Twilio’s equity plan and corporate bonus plan. All roles are generally eligible for the following benefits: health care insurance, 401(k) retirement account, paid sick time, paid personal time off, paid parental leave. The successful candidate’s starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location.</p>
<p>Application deadline information:</p>
<p>Applications for this role are intended to be accepted until April 30, 2026, but may change based on business needs.</p>
<p>Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That&#39;s why we seek out colleagues who embody our values , something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you&#39;re ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn&#39;t what you&#39;re looking for, please consider other open positions.</p>
<p>Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel></Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Google Analytics 4, JavaScript, HTML, Browser dev tools, Google Tag Manager, A/B and multivariate testing frameworks, Web personalization, Communication skills, Analytical skills, Data analysis, Insights, Storytelling, Actionable recommendations, Big Query, Adobe AEM, Adobe Target, Fraud monitoring tools, Looker Studio, Tableau, Customer Data Platforms, Analytics data integrations</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Twilio</Employername>
      <Employerlogo>https://logos.yubhub.co/twilio.com.png</Employerlogo>
      <Employerdescription>Twilio is a cloud communication platform that provides a range of APIs and services for building, scaling, and operating real-time communication and collaboration applications.</Employerdescription>
      <Employerwebsite>https://www.twilio.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/twilio/jobs/7736718</Applyto>
      <Location>Remote - US</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>32a63583-c9b</externalid>
      <Title>Sr. Analyst, Strategic Measurement</Title>
      <Description><![CDATA[<p>At Twilio, we&#39;re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences.</p>
<p>Our dedication to remote-first work, and strong culture of connection and global inclusion means that no matter your location, you&#39;re part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we&#39;re acquiring new skills and experiences that make work feel truly rewarding.</p>
<p>Join the team as Twilio&#39;s next Sr. Analyst, Strategic Measurement.</p>
<p>About the job:</p>
<p>We are actively recruiting for this role to fill an existing vacancy. This position is needed to serve as the &quot;New Frontiers&quot; lead for the Marketing Strategy, Analytics &amp; Innovation team, focusing on ambiguous, highly strategic emerging channels, potential programs &amp; measurement optimization/instrumentation.</p>
<p>You will be responsible for defining, measuring, and accelerating our growth in new digital landscapes, specifically focusing on AI Engine Optimization (AEO), brand measurement, and advanced attribution frameworks.</p>
<p>Partnering closely with the global marketing analytics teams, Brand, Demand Gen, PR, Content, and Developer Network teams, you will execute against high-visibility objectives, communicating marketing&#39;s business impact to executive leadership.</p>
<p>Ideally, you bring a strong statistical background, a substantial capacity for hypothesis-driven thinking, and the ability to deliver &quot;unmistakable clarity&quot; in complex executive presentations.</p>
<p>Responsibilities:</p>
<p>In this role, you&#39;ll:</p>
<ul>
<li>Analyse LLM visibility and Share of Voice, create a continuous feedback loop across various teams to optimise Twilio&#39;s presence in AI search/AEO (answer engine optimisation), and drive enablement and operational efforts to ensure consistent tool usage and storyline.</li>
<li>Support the strategy, data ingestion, and modelling required to build and maintain net new and existing brand measurement components. Drive a single, cohesive narrative combining disparate data sets.</li>
<li>Inspire experimentation adoption in teams. Build the frameworks and structure to drive experimentation rigor and clarity on the incrementality of marketing initiatives.</li>
<li>Support Media Mix Modelling (MMM) and Multi-Touch Attribution (MTA) initiatives to understand the impact of various marketing channels on Sign Ups, SQLs, and activations, and guide budget allocation.</li>
<li>Translate complex data sets into actionable insights and strategic recommendations for exec-level audiences.</li>
<li>Own enablement efforts for emerging channels, sharing accomplishments, KPIs, and cross-team collaborations broadly across the marketing organisation.</li>
</ul>
<p>Qualifications:</p>
<p>Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply. If your career is just starting or hasn&#39;t followed a traditional path, don&#39;t let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!</p>
<p>Required:</p>
<ul>
<li>5+ years of experience in statistical analysis, experimentation, and/or scenario planning.</li>
<li>A strong understanding of SEO and emerging AI Search (AEO/GEO) landscapes.</li>
<li>A hypothesis-driven thinker who does not just report the &quot;what,&quot; but actively tests the &quot;why&quot;; able to formulate testable guesses and distill the essential from complex marketing data.</li>
<li>Experience with running or interpreting advanced measurement frameworks (Incrementality, MMM, MTA).</li>
<li>Highly reliable and a self-starter.</li>
<li>Interest or experience in applying AI-driven solutions to analytics workflows.</li>
<li>Strong communication skills with the ability to communicate effectively with both technical and non-technical audiences.</li>
<li>Detail-oriented with a passion for clean, trustworthy data.</li>
<li>Strong analytical skills.</li>
<li>Excellent collaborator across cross-functional teams; skilled in managing stakeholder projects end-to-end.</li>
</ul>
<p>Desired:</p>
<ul>
<li>Experience supporting various stakeholder functions and their insights: Brand, PR, Social, Content, Demand Generation.</li>
<li>Experience with BI tools (Tableau, Looker Studio) and modelling (ROI, LTV:CAC, payback).</li>
<li>Interest or exposure to fraud monitoring tools and processes.</li>
<li>Working knowledge of Customer Data Platforms (CDPs) and analytics data integrations.</li>
</ul>
<p>Location:</p>
<p>This role will be remote, and based in Alberta, British Columbia, or Ontario, Canada.</p>
<p>Travel:</p>
<p>We prioritise connection and opportunities to build relationships with our customers and each other. For this role, approximately 15% travel is anticipated to help you connect in-person in a meaningful way.</p>
<p>What We Offer:</p>
<p>Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings programme, and much more. Offerings vary by location. Based on role, employees may also be eligible for additional compensation and benefits, including but not limited to incentive programmes, commissions, equity grants, health and wellness benefits, retirement contributions, and paid time off.</p>
<p>Compensation:</p>
<p>The estimated pay range for this role is $90,800 - $113,500 CAD, with a target bonus percentage of 12.50%. The successful candidate&#39;s starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location.</p>
<p>Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That&#39;s why we seek out colleagues who embody our values, something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you&#39;re ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn&#39;t what you&#39;re looking for, please consider other open positions.</p>
<p>Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, colour, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify programme in certain locations, as required by law.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$90,800 - $113,500 CAD</Salaryrange>
      <Skills>statistical analysis, experimentation, scenario planning, SEO, AI Search (AEO/GEO) landscapes, hypothesis-driven thinking, advanced measurement frameworks, Incrementality, Media Mix Modelling (MMM), Multi-Touch Attribution (MTA), communication skills, data analysis, clean, trustworthy data, analytical skills, collaboration, project management, BI tools (Tableau, Looker Studio), modelling (ROI, LTV:CAC, payback), fraud monitoring tools and processes, Customer Data Platforms (CDPs), analytics data integrations</Skills>
      <Category>Marketing</Category>
      <Industry>Technology</Industry>
      <Employername>Twilio</Employername>
      <Employerlogo>https://logos.yubhub.co/twilio.com.png</Employerlogo>
      <Employerdescription>Twilio delivers innovative solutions to hundreds of thousands of businesses and empowers millions of developers worldwide to craft personalized customer experiences.</Employerdescription>
      <Employerwebsite>https://www.twilio.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/twilio/jobs/7736722</Applyto>
      <Location>Remote - Canada</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>f86b2355-24d</externalid>
      <Title>Incentive Compensation System Engineer</Title>
      <Description><![CDATA[<p>As an Incentive Compensation Systems Engineer, you will lead the technical implementation, configuration, and optimization of our incentive compensation management (ICM) platform. You will work closely with Revenue Operations, Data Engineering, Finance, Accounting, and HR to translate incentive compensation designs into scalable system configurations, ensuring accurate calculations, reliable integrations, and efficient workflows.</p>
<p><strong>Responsibilities:</strong></p>
<p><strong>ICM Platform Implementation &amp; Configuration</strong></p>
<ul>
<li>Lead the end-to-end implementation and configuration of our ICM platform, including plan modeling, crediting rules, quota management, and payout calculations</li>
<li>Translate compensation plan documents into system logic, ensuring accurate and auditable calculation outputs</li>
<li>Design and maintain system workflows for plan changes, exceptions, and adjustments</li>
<li>Develop and execute testing strategies to validate system accuracy before each compensation cycle</li>
<li>Provide technical guidance during plan design discussions, advising on system capabilities and constraints</li>
</ul>
<p><strong>Data Integration &amp; Architecture</strong></p>
<ul>
<li>Build and maintain integrations between the ICM platform and source systems (CRM, ERP, HRIS, billing platforms)</li>
<li>Ensure data integrity across the compensation data pipeline, from opportunity data through final payout</li>
<li>Partner with Data Engineering and Revenue Operations to define data requirements and resolve data quality issues</li>
<li>Document data flows, transformation logic, and system dependencies</li>
</ul>
<p><strong>System Administration &amp; Optimization</strong></p>
<ul>
<li>Serve as the primary system administrator for the ICM platform, managing user access, security, and system health</li>
<li>Identify opportunities to streamline and automate compensation processes, reducing manual intervention and cycle time</li>
<li>Monitor system performance and troubleshoot calculation errors or integration failures</li>
<li>Evaluate and recommend new technologies, including Claude-powered solutions, to enhance system capabilities</li>
</ul>
<p><strong>Reporting &amp; Analytics</strong></p>
<ul>
<li>Build and maintain reporting infrastructure within the ICM platform to support compensation dashboards and attainment tracking</li>
<li>Partner with Revenue Operations and Accounting to deliver accurate, timely compensation data for forecasting and accruals</li>
<li>Enable self-service reporting for compensation administrators and business partners</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>5+ years of experience implementing or administering ICM platforms (e.g., Xactly, Varicent, Anaplan)</li>
<li>Strong technical skills with experience in SQL, data integration tools, and system configuration</li>
<li>Proven ability to translate business requirements into system logic and maintain accurate, auditable calculations</li>
<li>Experience managing data pipelines and ensuring data quality across integrated systems</li>
<li>Strong problem-solving skills with a systematic approach to debugging and root cause analysis</li>
<li>Excellent documentation practices and ability to communicate technical concepts to non-technical stakeholders</li>
</ul>
<p><strong>Preferred Qualifications:</strong></p>
<ul>
<li>Experience with full ICM platform implementation projects, including vendor selection and data migration</li>
<li>Proficiency with APIs and integration platforms (e.g., Workato, Fivetran, custom ETL)</li>
<li>Experience with Salesforce administration or development</li>
<li>Familiarity with consumption-based compensation models</li>
<li>Experience with Claude Code or other AI-assisted development tools</li>
<li>Background in sales operations, revenue operations, or finance systems</li>
</ul>
<p><strong>Logistics:</strong></p>
<ul>
<li>Minimum education: Bachelor’s degree or an equivalent combination of education, training, and/or experience</li>
<li>Required field of study: A field relevant to the role as demonstrated through coursework, training, or professional experience</li>
<li>Minimum years of experience: Years of experience required will correlate with the internal job level requirements for the position</li>
<li>Location-based hybrid policy: Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</li>
<li>Visa sponsorship: We do sponsor visas! However, we aren’t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, data integration tools, system configuration, ICM platforms, Xactly, Varicent, Anaplan, APIs, integration platforms, Workato, Fivetran, custom ETL, Salesforce administration, development, consumption-based compensation models, Claude Code, AI-assisted development tools, sales operations, revenue operations, finance systems</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a public benefit corporation that creates reliable, interpretable, and steerable AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5141849008</Applyto>
      <Location>Ontario, CAN</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>2d68b2bc-c36</externalid>
      <Title>Incentive Compensation System Engineer</Title>
      <Description><![CDATA[<p>As an Incentive Compensation Systems Engineer at Anthropic, you will lead the technical implementation, configuration, and optimization of our incentive compensation management (ICM) platform. You will work closely with Revenue Operations, Data Engineering, Finance, Accounting, and HR to translate incentive compensation designs into scalable system configurations, ensuring accurate calculations, reliable integrations, and efficient workflows.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Leading the end-to-end implementation and configuration of our ICM platform, including plan modeling, crediting rules, quota management, and payout calculations</li>
<li>Translating compensation plan documents into system logic, ensuring accurate and auditable calculation outputs</li>
<li>Designing and maintaining system workflows for plan changes, exceptions, and adjustments</li>
<li>Developing and executing testing strategies to validate system accuracy before each compensation cycle</li>
<li>Providing technical guidance during plan design discussions, advising on system capabilities and constraints</li>
</ul>
<p>In addition, you will build and maintain integrations between the ICM platform and source systems (CRM, ERP, HRIS, billing platforms), ensure data integrity across the compensation data pipeline, and document data flows, transformation logic, and system dependencies.</p>
<p>You will serve as the primary system administrator for the ICM platform, managing user access, security, and system health, and identify opportunities to streamline and automate compensation processes, reducing manual intervention and cycle time.</p>
<p>You will also build and maintain reporting infrastructure within the ICM platform to support compensation dashboards and attainment tracking, and partner with Revenue Operations and Accounting to deliver accurate, timely compensation data for forecasting and accruals.</p>
<p>To be successful in this role, you will need 5+ years of experience implementing or administering ICM platforms, strong technical skills with experience in SQL, data integration tools, and system configuration, and proven ability to translate business requirements into system logic and maintain accurate, auditable calculations.</p>
<p>You will also need experience managing data pipelines and ensuring data quality across integrated systems, strong problem-solving skills with a systematic approach to debugging and root cause analysis, excellent documentation practices, and ability to communicate technical concepts to non-technical stakeholders.</p>
<p>The annual compensation range for this role is $190,000-$270,000 USD.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$190,000-$270,000 USD</Salaryrange>
      <Skills>SQL, data integration tools, system configuration, ICM platforms, plan modeling, crediting rules, quota management, payout calculations, APIs, integration platforms, Salesforce administration, development, consumption-based compensation models, Claude Code, AI-assisted development tools</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a public benefit corporation that creates reliable, interpretable, and steerable AI systems.</Employerdescription>
      <Employerwebsite>https://www.anthropic.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5042002008</Applyto>
      <Location>San Francisco, CA | New York City, NY</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>2bf2ef11-7d6</externalid>
      <Title>Senior Backend Engineer, Data Modeling and Ingestion Platform</Title>
      <Description><![CDATA[<p>We are looking for a Senior Backend Engineer to lead the unification of large, highly rich, and heterogeneous datasets sourced from a wide range of external providers. These datasets are used to power our generative audio models.</p>
<p>Your work will create the foundational dataset that powers our research by building robust, scalable systems for linking, deduplicating, reconciling, and enriching data at massive scale. This role centres on high-impact bulk ingestion and advanced data linkage. You will design the logic, algorithms, and strategies that transform many independent datasets into a unified, high-quality canonical asset used throughout the company.</p>
<p>You will collaborate closely with ML researchers and product teams, working with tools such as BigQuery, Dataflow/Beam, TFRecords, and, where beneficial, distributed systems frameworks like Ray. Familiarity with ML workflows using JAX or multihost training is a plus, as the datasets you produce will directly support that ecosystem.</p>
<p>Responsibilities:</p>
<ul>
<li>Build high-throughput bulk ingestion workflows to integrate datasets from multiple external providers.</li>
<li>Design and implement scalable entity-resolution solutions, including record linking, deduplication, clustering, and conflict arbitration.</li>
<li>Create and refine matching logic, decision rules, and similarity functions to align datasets with high accuracy and strong coverage.</li>
<li>Define and track data quality indicators, such as overlap metrics, match precision/recall, duplicate rates, and completeness.</li>
<li>Prepare training-ready datasets in formats such as TFRecords, and structure data to meet ML research requirements.</li>
<li>Develop processing components using Dataflow (Beam) and manage large analytical workloads in BigQuery.</li>
<li>Leverage frameworks like Ray to accelerate large-scale experiments, feature extraction, and research-oriented data preparation.</li>
<li>Collaborate with ML researchers to anticipate downstream requirements and evolve linkage strategies as new sources and use cases emerge.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Experience working with large, heterogeneous datasets from multiple providers or domains.</li>
<li>Strong background in entity resolution, deduplication, data unification, or related large-scale data integration techniques.</li>
<li>Proficiency in Python, with an emphasis on efficient, scalable data processing.</li>
<li>Experience with BigQuery, Google Dataflow/Apache Beam, or similar batch-processing frameworks.</li>
<li>Familiarity with data validation, normalization, reconciliation, and building consistent views across diverse data sources.</li>
<li>Ability to craft well-structured matching and decision strategies that balance accuracy, completeness, and computational efficiency.</li>
<li>Comfortable iterating quickly on pragmatic solutions, balancing correctness with time-to-delivery.</li>
<li>Clear communication skills and the ability to collaborate closely with ML and research teams.</li>
</ul>
<p>Nice to Have:</p>
<ul>
<li>Knowledge of architecting Google Cloud Platform systems at scale.</li>
<li>Experience with distributed compute frameworks such as Ray, Spark, or Flink.</li>
<li>Understanding of JAX-based ML pipelines, multihost training setups, or large-scale data preparation for accelerator-backed workflows.</li>
<li>Familiarity with TFRecords or other high-volume training data formats.</li>
<li>Exposure to ranking, clustering, or statistical similarity modeling.</li>
<li>Experience with Go, NextJS, and/or React Native to contribute to full-stack development.</li>
</ul>
<p>Why Join Us:</p>
<ul>
<li>You will design the core dataset that underpins our research, product development, and generative audio models.</li>
<li>You&#39;ll work on large-scale data challenges that require creativity, algorithmic thinking, and engineering excellence.</li>
<li>You&#39;ll join a small, fast-moving team where your decisions shape the direction of our data and research capabilities.</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Highly competitive salary and equity.</li>
<li>Quarterly productivity budget.</li>
<li>Flexible time off.</li>
<li>Fantastic office location in Manhattan.</li>
<li>Productivity package, including ChatGPT Plus, Claude Code, and Copilot.</li>
<li>Top-notch private health, dental, and vision insurance for you and your dependents.</li>
<li>401(k) plan options with employer matching.</li>
<li>Concierge medical/primary care through One Medical and Rightway.</li>
<li>Mental health support from Spring Health.</li>
<li>Personalized life insurance, travel assistance, and many other perks.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,000 - $220,000</Salaryrange>
      <Skills>Python, BigQuery, Dataflow/Beam, TFRecords, Ray, JAX, Multihost training, Entity resolution, Deduplication, Data unification, Large-scale data integration, Go, NextJS, React Native, Distributed compute frameworks, Ranking, Clustering, Statistical similarity modeling</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Udio</Employername>
      <Employerlogo>https://logos.yubhub.co/udio.com.png</Employerlogo>
      <Employerdescription>Udio is a company that creates generative audio models.</Employerdescription>
      <Employerwebsite>https://udio.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/udio/jobs/4988140008</Applyto>
      <Location>New York</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>7b750523-8ff</externalid>
      <Title>Staff Software Engineer, Data Engineering</Title>
      <Description><![CDATA[<p>We are seeking a Staff Software Engineer to lead the technical strategy and implementation of our enterprise data architecture, governance foundations, and analytics enablement tooling.</p>
<p>In this role, you will be the primary engineering counterpart to the Senior Product Manager for Data Enablement &amp; Governance, jointly shaping the roadmap for enterprise analytics, shared definitions, and the tools that help Omada answer questions faster and more reliably.</p>
<p>You will design and evolve core data products, define patterns and standards used across the company, and drive the technical execution of initiatives that ensure our metrics, reports, and data products are scalable, governed, and trustworthy.</p>
<p>This is a high-impact, cross-functional Staff role working across Data Engineering, Data Science, Analytics, Product, IT, and business leaders.</p>
<p><strong>Key Responsibilities:</strong></p>
<p><strong>Enterprise Data Architecture</strong></p>
<ul>
<li>Own the vision and technical roadmap for Omada&#39;s enterprise data architecture, spanning ingestion, storage, modeling, and serving layers for analytics and applied statistics use cases.</li>
<li>Design, implement, and evolve scalable, secure, and cost-efficient data solutions (data lakes, warehouses, marts, semantic layers) that support governed, cross-functional analytics and self-service.</li>
<li>Define and socialize architectural patterns, data contracts, and integration standards used by data and product teams across the organization.</li>
<li>Anticipate future needs (e.g., new product lines, new modalities, AI/ML workloads) and drive proactive architectural changes rather than reacting to incidents or point-in-time requests.</li>
</ul>
<p><strong>Data Modeling, Quality, and Governance Foundations</strong></p>
<ul>
<li>Lead the design of logical and physical data models to support enterprise metrics, dashboards, and ad hoc analytics, with a focus on reusability and clear ownership.</li>
<li>Implement robust data quality, validation, and monitoring frameworks that underpin trusted “single source of truth” definitions for core concepts (e.g., active member, MAU, GLP-1 member).</li>
<li>Partner with the Senior Product Manager, Data Enablement &amp; Governance to translate governance decisions (definitions, ownership, change-management processes) into concrete technical implementations in the data platform.</li>
<li>Set standards and review mechanisms to ensure new pipelines, marts, and reports align with enterprise definitions and governance policies.</li>
<li>Continuously improve performance, scalability, and cost-efficiency of data workflows and storage; lead deep dives and remediation for complex production issues.</li>
</ul>
<p><strong>Enterprise Data Products Lifecycle</strong></p>
<ul>
<li>In close partnership with the Senior PM, define and deliver core, reusable data products (e.g., engagement, clinical, financial, client, care delivery datasets) that power dashboards, reporting, and self-service analytics.</li>
<li>Co-architect and implement technical foundations for AI-assisted analytics tools, governed semantic layers, and reporting applications that make analysts and business users more efficient.</li>
<li>Partner with Product and Engineering teams owning tools like Amplitude, Tableau, and internal reporting tools to ensure consistent instrumentation, mapping to enterprise definitions, and scalable access patterns.</li>
<li>Translate business and product requirements into resilient schemas, data services, and interfaces that are usable, maintainable, and auditable.</li>
<li>Ensure production data delivery meets defined SLAs and supports downstream BI, reporting apps, and applied statistics workloads.</li>
<li>Play a key role in cross-functional forums (e.g., Data Governance Committee, analytics communities) as the technical voice for feasibility, risk, and long-term platform health.</li>
</ul>
<p><strong>Technical Leadership, Mentorship, and Culture</strong></p>
<ul>
<li>Lead large, multi-team technical initiatives, from design to implementation and rollout, setting a high bar for design docs, reviews, and execution quality.</li>
<li>Mentor senior and mid-level engineers, elevating the team’s skills in data modeling, pipeline design, governance, and platform thinking.</li>
<li>Help shape playbooks for how product squads and spokes engage with central data teams on new metrics, data products, and applied stats projects.</li>
<li>Partner closely with Analytics, Data Science, Product, and business leaders to ensure data architecture and governance decisions are aligned with company OKRs and measurable business value.</li>
<li>Proactively identify complexity, duplication, and fragility in existing systems; drive simplification and standardization with sustainable solutions.</li>
<li>Model Omada’s values in day-to-day work, fostering a culture of trust, context-seeking, bold thinking, and high-impact delivery.</li>
</ul>
<p><strong>About You:</strong></p>
<ul>
<li>8+ years of experience building, maintaining, and orchestrating scalable data platforms and high-quality production pipelines, including significant experience in analytics or warehousing environments.</li>
<li>Demonstrated Staff-level impact: leading cross-team technical initiatives, making architectural decisions that shaped a multi-year roadmap, and influencing stakeholders beyond your immediate team.</li>
<li>Deep experience with cloud data ecosystems (e.g., AWS) and modern data warehouses (e.g., Redshift, Snowflake, BigQuery), including MPP query optimization.</li>
<li>Strong background in data modeling for OLTP and OLAP, and designing reusable data products for BI, reporting, and advanced analytics.</li>
<li>Hands-on experience implementing data quality, observability, and governance frameworks, ideally in a regulated or PHI/PII-sensitive environment.</li>
<li>Experience partnering with Product Management and Analytics to define and deliver platform capabilities, not just point solutions.</li>
</ul>
<p><strong>Technical Skills:</strong></p>
<ul>
<li>Strong proficiency in SQL (analytical and performance-tuned) and experience with relational and MPP databases.</li>
<li>Proficiency in at least one modern programming language used in data engineering (e.g., Python, Java, Scala) and comfort applying software engineering best practices (testing, CI/CD, code review).</li>
<li>Experience with workflow orchestration and data integration tools (e.g., Airflow) and event-driven or streaming patterns where appropriate.</li>
<li>Familiarity with BI and analytics tools (e.g., Tableau, Amplitude, or similar) and how they integrate with governed data layers.</li>
<li>Experience with data governance concepts (ownership, lineage, definitions, access controls) and their technical implementation in a modern data stack.</li>
<li>Familiarity with AI tools for development.</li>
</ul>
<p><strong>Communication &amp; Working Style:</strong></p>
<ul>
<li>Excellent communication and collaboration skills, with the ability to convey complex technical concepts to non-technical stakeholders.</li>
<li>Highly self-directed and comfortable operating in ambiguous, cross-functional problem spaces, creating clarity and direction where none exists.</li>
<li>Strong sense of ownership and bias for impact; you care about outcomes for members, customers, and internal users, not just elegant systems.</li>
</ul>
<p><strong>Benefits:</strong></p>
<ul>
<li>Competitive salary with generous annual cash bonus</li>
<li>Equity grants</li>
<li>Remote first work from home culture</li>
<li>Flexible Time Off to help you recharge</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Cloud data ecosystems, Modern data warehouses, MPP query optimization, Data modeling, Data quality, Data governance, Workflow orchestration, Data integration, Event-driven or streaming patterns, BI and analytics tools, AI tools for development</Skills>
      <Category>Engineering</Category>
      <Industry>Healthcare</Industry>
      <Employername>Omada Health</Employername>
      <Employerlogo>https://logos.yubhub.co/omadahealth.com.png</Employerlogo>
      <Employerdescription>Omada Health is a healthcare technology company that provides digital therapeutics for chronic disease management.</Employerdescription>
      <Employerwebsite>https://www.omadahealth.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/omadahealth/jobs/7753330</Applyto>
      <Location>Remote, USA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>5ec63ea6-5a3</externalid>
      <Title>Data Engineer</Title>
      <Description><![CDATA[<p>At Neighbor, we&#39;re building the largest hyperlocal marketplace the world has ever seen. As a Data Engineer, you will be the core engineering resource responsible for building, scaling, and optimizing the data infrastructure that transforms raw events into high-fidelity, actionable intelligence.</p>
<p>In this role, you will be the cornerstone of our data infrastructure, responsible for the extraction, transformation, and loading of the data that powers our nationwide, best-in-class marketplace. By implementing software engineering best practices and scalable solutions, you will be critical in empowering the CEO, executive team, managers, and individual contributors with the robust and trustworthy intelligence needed to scale and innovate across our marketplace.</p>
<p><strong>Primary Responsibilities</strong></p>
<ul>
<li>Design, implement, and maintain scalable data transformation layers and code-first orchestration frameworks to ensure the delivery of high-fidelity, reusable data models</li>
<li>Design and build robust pipelines to ingest data from diverse sources (APIs, logs, relational DBs)</li>
<li>Ensure the reliable and timely execution of all critical data pipelines (ETLs/ELTs) to maintain data integrity and freshness</li>
<li>Standardize analytics workflows by integrating software engineering best practices, including version control, CI/CD pipelines, and automated data validation protocols</li>
<li>Develop and refine a robust semantic layer to facilitate self-service analytics, enabling stakeholders to derive insights without exposure to underlying architectural complexities</li>
<li>Monitor and optimize cloud compute utilization and data model performance to ensure high availability and low-latency reporting during periods of rapid data scaling</li>
<li>Serve as a strategic technical partner to leadership across Product, Engineering, Marketing, and Finance to align data infrastructure with organizational objectives</li>
<li>Become a subject matter expert on the product ecosystem, user behavior, and marketing life cycles to better translate raw data into business value</li>
<li>Serve as a versatile technical resource capable of stepping into the Data Analyst capacity when necessary, performing deep-dive quantitative analysis and building sophisticated visualizations to support executive decision-making</li>
<li>Mentor the data analytics team on advanced technical methodologies to foster a culture of engineering excellence and data autonomy</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>3+ years of experience in data engineering or analytics engineering</li>
<li>Bachelor&#39;s degree in quantitative and/or technical fields (Math, Physics, Statistics, Economics, Computer Science, Engineering, etc.) OR 5+ years work experience as a Data Engineer</li>
<li>Expert-level mastery of SQL, with the ability to write, tune, and optimize complex queries for high-volume environments</li>
<li>Strong command of at least one major programming language used for data processing</li>
<li>Hands-on experience designing and maintaining data lakes or cloud-based data warehouses</li>
<li>Deep understanding of data integration patterns, including data ingestion, transformation, and automated cleansing (ETL/ELT)</li>
<li>Experience applying scientific, mathematical, or statistical techniques to analyze data and build predictive models</li>
<li>Advanced ability to translate complex datasets into actionable narratives using modern business intelligence and reporting tools</li>
<li>A proven track record of using quantitative analysis to solve ambiguous problems and drive strategic decision-making in a fast-paced environment</li>
<li>Exceptional ability to collaborate with non-technical stakeholders, translating business requirements into technical specs and vice versa</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Generous Stock options</li>
<li>Medical, dental, and vision insurance</li>
<li>Generous PTO</li>
<li>11 paid company holidays</li>
<li>Hybrid work model - WFH every Monday</li>
<li>401(k) plan</li>
<li>Infant care leave</li>
<li>On-site gym/showers open 24/7</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Programming languages, Data lakes, Cloud-based data warehouses, Data integration patterns, Scientific, mathematical, or statistical techniques, Business intelligence and reporting tools</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Neighbor</Employername>
      <Employerlogo>https://logos.yubhub.co/neighbor.com.png</Employerlogo>
      <Employerdescription>Neighbor is a marketplace for self storage and parking, operating across almost every U.S. city.</Employerdescription>
      <Employerwebsite>https://neighbor.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/neighbor/da1304b7-89ad-4ac0-99e8-9c0cf8284f1c</Applyto>
      <Location>U.S.</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>ba26ba46-932</externalid>
      <Title>Solutions Engineer - Partnerships</Title>
      <Description><![CDATA[<p>About Hebbia</p>
<p>Hebbia is an AI platform for investors and bankers that generates alpha and drives upside.</p>
<p>As a Solutions Engineer - Partnerships at Hebbia, you will be the technical and strategic lead for activating our data and system partnerships in the field. You will operate at the intersection of partnerships, sales, and product, ensuring that Hebbia’s integrations with market data providers and fintech platforms translate into real, differentiated value during pre-sales and customer engagements.</p>
<p>Responsibilities</p>
<ul>
<li>Own the technical strategy for partnership-driven deals, from discovery through close</li>
<li>Lead joint pre-sales motions with partners, activating integrations with market data providers, content platforms, and fintech systems</li>
<li>Develop a deep understanding of partner datasets, APIs, and system capabilities, and translate them into differentiated Hebbia solutions</li>
<li>Design and configure integrated AI-powered workflows that combine Hebbia with partner data and systems to create real-world value</li>
<li>Act as the technical face of partnerships in the field, supporting co-selling, joint demos, and strategic accounts</li>
<li>Deliver high-stakes integrated demos and solution narratives to senior stakeholders, showcasing cross-platform value</li>
<li>Identify gaps in integrations or positioning, and drive feedback loops to product and partnerships teams</li>
<li>Surface risks early across multi-party deals, and align stakeholders to accelerate activation and deal progression</li>
<li>Enable internal teams (Sales, Partnerships) with clear frameworks and best practices for positioning ecosystem value</li>
</ul>
<p>Requirements</p>
<ul>
<li>5+ years in a customer-facing technical role in complex enterprise environments (Solutions Engineering, Sales Engineering, Consulting, or similar)</li>
<li>Experience working with market data, financial datasets, or fintech platforms (e.g., market data platforms, CRMs, virtual data rooms, portfolio monitoring tools, research platforms, or data infrastructure systems)</li>
<li>Strong understanding of the financial technology ecosystem and how systems and data integrate across workflows</li>
<li>Comfortable driving partnership or ecosystem-led pre-sales motions, including joint solutions and co-selling</li>
<li>Able to translate complex, multi-system environments into clear, structured solution strategies</li>
<li>Strong executive presence with a bias for ownership in high-ambiguity, performance-driven environments</li>
</ul>
<p>What Success Looks Like in 6–12 Months</p>
<ul>
<li>You independently lead technical discovery, demos, and evaluations for partnership-driven opportunities, embedding partner data and systems into Hebbia solutions</li>
<li>You are a trusted partner to Sales and Partnerships, advancing deals by clarifying ecosystem value and strengthening joint solution narratives</li>
<li>You activate partnerships in the field, translating integrations into tangible customer outcomes across pre-sales and early deployments</li>
<li>You deliver integrated, high-impact demos that showcase Hebbia across market data, CRMs, and other fintech systems</li>
<li>You proactively identify risks across multi-system, multi-party environments and drive alignment on mitigation strategies</li>
<li>You meaningfully contribute to pipeline and revenue influenced by partnerships, improving win rates and deal velocity</li>
<li>You build reusable assets and playbooks that scale how Hebbia and partners position and deliver joint value</li>
<li>You provide structured feedback on integrations and ecosystem gaps, and are viewed internally as a go-to leader for partnership activation</li>
</ul>
<p>Why Hebbia</p>
<p>Hebbia is redefining how teams work with complex information, turning unstructured data into clarity, speed, and confident decision-making. As a Solutions Engineer, you’ll sit at the center of that transformation, working directly with some of the most demanding and thoughtful buyers in the world.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>market data, financial datasets, fintech platforms, AI-powered workflows, data integration, ecosystem value, solution narratives, complex systems, structured solution strategies</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Hebbia</Employername>
      <Employerlogo>https://logos.yubhub.co/hebbia.com.png</Employerlogo>
      <Employerdescription>Hebbia is an AI platform for investors and bankers that generates alpha and drives upside. Founded in 2020 by George Sivulka and backed by Peter Thiel and Andreessen Horowitz.</Employerdescription>
      <Employerwebsite>https://hebbia.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/hebbia/jobs/4684835005</Applyto>
      <Location>New York City</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>89e6976d-c57</externalid>
      <Title>GTM Engineer</Title>
      <Description><![CDATA[<p>We&#39;re looking for a GTM Engineer to own the technical backbone of our revenue engine. This is a builder role: you&#39;ll design, ship, and continuously improve the systems, automations, and AI-powered workflows that help our GTM teams operate at their best.</p>
<p>You&#39;ll own our full GTM tech stack including Salesforce, Outreach, Gong, Apollo, Clay, Claude, and more, and be responsible for how we use AI to transform the way our field teams work. This is a high-visibility, high-ownership role that sits at the intersection of revenue strategy and technical execution.</p>
<p>Responsibilities:</p>
<ul>
<li>Identify, build, and iterate on AI-powered workflows using Hebbia, Claude, Clay, and adjacent tools for both pre- and post-sales teams</li>
<li>Build and maintain prompt libraries and role-specific AI assistants that give field team members leverage in their day-to-day work; set up measurement frameworks to track what&#39;s actually moving the needle</li>
<li>Own, administer, and continuously improve our core GTM tools, serving as the internal subject matter expert for each</li>
<li>Keep all systems integrated and healthy, with clean data flowing across the stack so that every downstream workflow, report, and automation is built on a foundation that can be trusted</li>
<li>Audit the full GTM workflow for manual, repetitive work and systematically replace it with automation across data capture, field updates, handoffs, lead enrichment, sequence enrollment, and signal monitoring</li>
<li>Work closely with GTM and Operations to understand pain points, translate problems into technical solutions, and roll out processes and workflows</li>
</ul>
<p>Requirements:</p>
<ul>
<li>4 to 7 years of experience in Revenue Operations, GTM Systems, GTM Engineering, or a closely adjacent technical role in a B2B SaaS environment</li>
<li>Practical experience with prompt engineering and building AI-assisted workflows using LLM APIs or tools like Claude</li>
<li>Deep hands-on experience with GTM tools including Salesforce, Gong, Outreach, Apollo, and Clay, with a track record of building workflows and automations that teams actually use</li>
<li>Comfortable working directly with APIs, webhooks, and data integrations. You can connect systems and debug sync issues without filing an engineering ticket</li>
<li>Experience owning vendor relationships including renewals, evaluations, and contract negotiations</li>
<li>Experience in fast-paced environments and comfort working both as part of a team and independently</li>
<li>Demonstrated ability to prioritize workload and manage multiple concurrent projects</li>
<li>Strong verbal and written communication skills; ability to work effectively with cross-functional teams</li>
</ul>
<p>Total cash compensation range for this role is $140,000 to $190,000.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$140,000 - $190,000</Salaryrange>
      <Skills>Revenue Operations, GTM Systems, GTM Engineering, LLM APIs, Claude, Salesforce, Gong, Outreach, Apollo, Clay, APIs, Webhooks, Data Integrations</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Hebbia</Employername>
      <Employerlogo>https://logos.yubhub.co/hebbia.com.png</Employerlogo>
      <Employerdescription>Hebbia is an AI platform for investors and bankers that generates alpha and drives upside, backed by Andreessen Horowitz and founded in 2020. The company powers investment decisions for major asset managers.</Employerdescription>
      <Employerwebsite>https://hebbia.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/hebbia/jobs/4679666005</Applyto>
      <Location>New York City</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>d44f63bb-9c5</externalid>
      <Title>GTM Systems Lead</Title>
      <Description><![CDATA[<p>We&#39;re looking for a GTM Systems Lead to own the technical backbone of our revenue engine. This is a builder role: you&#39;ll design, ship, and continuously improve the systems, automations, and AI-powered workflows that help our GTM teams operate at their best.</p>
<p>You&#39;ll own our full GTM tech stack including Salesforce, Outreach, Gong, Apollo, Clay, Claude, and more, and be responsible for how we use AI to transform the way our field teams work. This is a high-visibility, high-ownership role that sits at the intersection of revenue strategy and technical execution.</p>
<p>Responsibilities:</p>
<ul>
<li>Identify, build, and iterate on AI-powered workflows using Hebbia, Claude, Clay, and adjacent tools for both pre- and post-sales teams</li>
<li>Build and maintain prompt libraries and role-specific AI assistants that give field team members leverage in their day-to-day work; set up measurement frameworks to track what&#39;s actually moving the needle</li>
<li>Own, administer, and continuously improve our core GTM tools, serving as the internal subject matter expert for each</li>
<li>Keep all systems integrated and healthy, with clean data flowing across the stack so that every downstream workflow, report, and automation is built on a foundation that can be trusted</li>
<li>Audit the full GTM workflow for manual, repetitive work and systematically replace it with automation across data capture, field updates, handoffs, lead enrichment, sequence enrollment, and signal monitoring</li>
</ul>
<p>Cross-Functional Partnership:</p>
<ul>
<li>Work closely with the GTM and Operations teams to understand pain points, translate problems into technical solutions, and roll out processes and workflows</li>
</ul>
<p>Who You Are:</p>
<ul>
<li>4 to 7 years of experience in Revenue Operations, GTM Systems, GTM Engineering, or a closely adjacent technical role in a B2B SaaS environment</li>
<li>Practical experience with prompt engineering and building AI-assisted workflows using LLM APIs or tools like Claude</li>
<li>Deep hands-on experience with GTM tools including Salesforce, Gong, Outreach, Apollo, and Clay, with a track record of building workflows and automations that teams actually use</li>
<li>Comfortable working directly with APIs, webhooks, and data integrations. You can connect systems and debug sync issues without filing an engineering ticket</li>
<li>Experience owning vendor relationships including renewals, evaluations, and contract negotiations</li>
<li>Experience in fast-paced environments and comfort working both as part of a team and independently</li>
<li>Demonstrated ability to prioritize workload and manage multiple concurrent projects</li>
<li>Strong verbal and written communication skills; ability to work effectively with cross-functional teams</li>
</ul>
<p>Compensation:</p>
<ul>
<li>The total cash compensation range for this role is $140,000 to $190,000. This range may be inclusive of several career levels at Hebbia and will be narrowed during the interview process based on the candidate’s experience and qualifications. Adjustments outside of this range may be considered for candidates whose qualifications significantly differ from those outlined in the job description.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$140,000 - $190,000</Salaryrange>
      <Skills>Revenue Operations, GTM Systems, GTM Engineering, Prompt Engineering, LLM APIs, Claude, Salesforce, Gong, Outreach, Apollo, Clay, APIs, Webhooks, Data Integrations</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Hebbia</Employername>
      <Employerlogo>https://logos.yubhub.co/hebbia.com.png</Employerlogo>
      <Employerdescription>Hebbia is an AI platform that generates alpha and drives upside for investors and bankers. Founded in 2020, it powers investment decisions for major asset managers.</Employerdescription>
      <Employerwebsite>https://hebbia.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/hebbia/jobs/4679879005</Applyto>
      <Location>New York City</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>7619176a-424</externalid>
      <Title>Forward Deployed Engineer</Title>
      <Description><![CDATA[<p>You will spend the majority of your time embedded with Hebbia&#39;s most strategic customers, building the last mile of our platform for their specific workflows, data, and domain. This is a hands-on engineering role. You write production code, you ship it, you own it.</p>
<p>As a Forward Deployed Engineer, you are the bridge between Hebbia&#39;s platform and the real-world complexity of our customers&#39; environments. You sit with the customer&#39;s team, understand their hardest problems, and build solutions that make Hebbia indispensable. Then you bring what you&#39;ve learned back to our engineering and product teams to make the platform better for everyone.</p>
<p>This role is for engineers who want to combine deep technical work with direct customer impact. You will see your code create value in days, not months. The FDE team operates at the intersection of engineering and go-to-market. You will work closely with our core engineering team (shared code review, architecture alignment, deploy pipelines) and with our account teams who direct where you deploy and what you focus on. Our team works in person 5 days a week at our offices in NYC and SF.</p>
<p>Responsibilities:</p>
<ul>
<li>Embed with strategic accounts to deeply understand their domain, data, and workflows</li>
<li>Build custom integrations, workflow automations, and domain-specific solutions on top of Hebbia&#39;s platform</li>
<li>Write production code that deploys through our CI/CD pipelines and meets our engineering standards</li>
<li>Own the technical relationship with the customer&#39;s team during your engagement</li>
<li>Prototype fast, validate with the customer, iterate, and ship</li>
<li>Return from engagements and work with engineering and product to generalize reusable patterns into platform capabilities</li>
<li>Participate in code review, on-call rotation, and architecture discussions alongside core engineering</li>
<li>Build connectors to customer data sources and document management systems</li>
</ul>
<p>Who You Are:</p>
<ul>
<li>5+ years software development experience at a venture-backed startup or top technology firm</li>
<li>Strong full-stack engineering skills. You build across the stack: APIs, data pipelines, frontend when needed, infrastructure when needed.</li>
<li>Comfortable working in ambiguity. Customer problems are messy and underspecified. You figure it out.</li>
<li>High customer empathy. You enjoy sitting with users, understanding their workflows, and translating pain points into technical solutions.</li>
<li>Fast and pragmatic. You prototype, validate, and ship in days and weeks, not quarters.</li>
<li>Strong communicator. You are the primary technical point of contact for the customer. You can talk to both engineers and executives.</li>
<li>Experience with cloud platforms (e.g., AWS) and modern backend technologies (Python, TypeScript, Go)</li>
<li>Experience with data integrations, ETL pipelines, or enterprise data systems (S3, Snowflake, SharePoint, etc.) is a plus</li>
<li>Experience with LLMs, RAG systems, or applied AI is a plus but not required</li>
<li>Prior experience in finance, legal, or consulting domains is a plus</li>
<li>Experience with customer-facing engineering roles (solutions engineering, professional services, or similar) is a plus</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,000 to $300,000</Salaryrange>
      <Skills>Full-stack engineering, Cloud platforms (e.g., AWS), Modern backend technologies (Python, TypeScript, Go), Data integrations, ETL pipelines, or enterprise data systems (S3, Snowflake, SharePoint, etc.), Customer-facing engineering roles (solutions engineering, professional services, or similar), LLMs, RAG systems, or applied AI, Finance, legal, or consulting domains</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Hebbia</Employername>
      <Employerlogo>https://logos.yubhub.co/hebbia.com.png</Employerlogo>
      <Employerdescription>Hebbia is an AI platform that generates alpha and drives upside for investors and bankers. Founded in 2020, it powers investment decisions for large asset managers.</Employerdescription>
      <Employerwebsite>https://hebbia.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/hebbia/jobs/4679338005</Applyto>
      <Location>New York City; San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>b3de0848-388</externalid>
      <Title>Solutions Engineer</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Solutions Engineer to join our team. As a Solutions Engineer, you will collaborate with our Sales and Success teams to understand customer needs and design tailored solutions that help them drive towards their desired outcomes.</p>
<p>Your responsibilities will include:</p>
<ul>
<li>Designing solutions that meet customer needs and drive business value</li>
<li>Co-owning product and technical evaluations with sales teams</li>
<li>Defining scope, environments, dependencies, success criteria, and execution plans</li>
<li>Translating customer workflows into scalable integration designs</li>
<li>Validating technical readiness through structured discovery before delivery</li>
<li>Guiding customers through enterprise-grade reviews, security questionnaires, data governance evaluations, and compliance alignment</li>
<li>Consulting with legal operations, IT admins, and firm executives to drive adoption strategy, AI safety best practices, governance, and long-term expansion</li>
</ul>
<p>To be successful in this role, you will need strong technical expertise, particularly in SaaS architectures, APIs, authentication models, and data integration patterns. You should also have experience in pre-sales support, clear communication skills, and the ability to connect technical design to measurable business value and customer impact.</p>
<p>Nice to have skills include experience in high-growth startup environments, understanding of legal tech, AI, or enterprise data environments, and the ability to context switch between pre-sales strategy and hands-on technical work.</p>
<p>Benefits include competitive salary and equity, 401(k) program with employer matching, health, dental, vision, and life insurance, short-term and long-term disability, commuter benefits, autonomous work environment, office setup reimbursement, flexible time off, and quarterly team gatherings.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SaaS architectures, APIs, authentication models, data integration patterns, pre-sales support, clear communication skills, technical design, measurable business value, customer impact, high-growth startup environments, legal tech, AI, enterprise data environments, context switching</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Eve</Employername>
      <Employerlogo>https://logos.yubhub.co/eve.com.png</Employerlogo>
      <Employerdescription>Eve is a legal technology company that provides AI-driven solutions to plaintiff law firms.</Employerdescription>
      <Employerwebsite>https://eve.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/Eve/20610ca6-ebd4-4344-b07c-60c9328f3307</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>335b661c-041</externalid>
      <Title>Protocol Engineer</Title>
      <Description><![CDATA[<p><strong>About Daylight</strong></p>
<p>Daylight is a decentralized energy company building a protocol that converts distributed solar and storage systems into yield-bearing infrastructure.</p>
<p><strong>The Role</strong></p>
<p>As a Protocol Engineer at Daylight, you&#39;ll lead the development of our core smart contracts and decentralized backend systems that underpin our onchain energy network. Our protocol interacts with energy hardware in the real world and tokenizes the value of flexible energy capacity into crypto-native incentives, bridging digital and physical infrastructure through verifiable smart contracts, oracles, and data integrations.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design and implement Solidity smart contracts that underpin Daylight’s decentralized energy markets</li>
<li>Lead architecture decisions for our EVM-based systems across mainnet and L2s</li>
<li>Manage crosschain deployments and integrations using LayerZero, Wormhole, or similar technologies</li>
<li>Develop core logic for staking, token issuance, energy capacity verification, and rewards</li>
</ul>
<p><strong>Data Integration &amp; Oracles</strong></p>
<ul>
<li>Architect secure offchain data flows to feed real-world energy data into onchain systems</li>
<li>Work with oracles and custom data infrastructure to bridge between energy hardware, APIs, and contracts</li>
<li>Design and maintain trust assumptions, verification mechanisms, and upgrade paths</li>
</ul>
<p><strong>Backend Engineering</strong></p>
<ul>
<li>Support Daylight’s core backend team as needed with Python/Django development</li>
<li>Help design and maintain APIs that interact with contracts and power our frontend applications</li>
<li>Participate in DevOps, testing, and deployment of protocol-related services</li>
</ul>
<p><strong>Security &amp; Auditing</strong></p>
<ul>
<li>Lead internal security reviews and coordinate third-party audits of smart contracts</li>
<li>Stay up-to-date on known exploits and best practices in DeFi and crosschain architecture</li>
</ul>
<p><strong>Product Strategy &amp; R&amp;D</strong></p>
<ul>
<li>Contribute to protocol incentive design, economic modeling, and simulations</li>
<li>Evaluate new technologies, frameworks, and cryptoeconomic primitives to inform protocol design and strategy</li>
</ul>
<p><strong>What we’re looking for</strong></p>
<ul>
<li>3+ years of experience developing and deploying Solidity smart contracts in production</li>
<li>Deep familiarity with DeFi protocols, especially stablecoins, lending markets, staking, or yield systems</li>
<li>Experience with crosschain development (e.g., LayerZero, Wormhole, Axelar)</li>
<li>Proven experience integrating offchain data via oracles (Chainlink, Chronicle, custom feeds)</li>
<li>Understanding of EVM security, auditing practices, and gas optimization</li>
<li>Comfortable writing and reviewing Python code and supporting traditional backend engineering tasks</li>
<li>Passion for clean energy, decentralization, and building systems with real-world impact</li>
<li>Based in NYC, excited to collaborate in-office</li>
</ul>
<p><strong>What we offer</strong></p>
<ul>
<li>Competitive salary and equity package</li>
<li>Medical, dental, and vision insurance</li>
<li>Monthly wellness and Wellhub stipend</li>
<li>Opportunity to shape a category-defining protocol from the ground up</li>
<li>Work in-person with a deeply motivated team in New York City</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Solidity, DeFi protocols, Crosschain development, Offchain data integration, EVM security, Python, Django, L2-native or modular appchains, Protocol simulations, Open-source protocols, DERs, Energy hardware, Grid operations</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Daylight</Employername>
      <Employerlogo>https://logos.yubhub.co/godaylight.com.png</Employerlogo>
      <Employerdescription>Daylight is a decentralized energy company building a protocol that converts distributed solar and storage systems into yield-bearing infrastructure.</Employerdescription>
      <Employerwebsite>https://www.godaylight.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/daylight/jobs/4775068008</Applyto>
      <Location>New York, NY</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>5b4ebc77-29b</externalid>
      <Title>Global Markets Accounting Lead</Title>
      <Description><![CDATA[<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto. The Global Markets Accounting Lead will play a pivotal role in shaping the future of our rapidly expanding prime services business.</p>
<p>This individual will be responsible for overseeing all brokerage-related accounting and reporting functions, ensuring compliance with regulations, and leading the design, documentation, and implementation of policies, processes, and controls. This role will also be responsible for working cross-functionally with legal, compliance, and other teams within Anchorage as we prepare for swap-dealer registration with the National Futures Association.</p>
<p>The role presents a unique and exciting opportunity to navigate the complexities of trading digital assets and derivatives and engaging in digital asset borrowing and lending.</p>
<p><strong>Technical Skills:</strong></p>
<ul>
<li>Demonstrate expertise in US GAAP with a focus on ASC 606 and ASC 815 and regulatory accounting requirements, ensuring accurate application to revenue recognition and accounting for prime brokerage services.</li>
<li>Expertise in accounting and regulatory reporting requirements for swap-dealers or related businesses, including knowledge of NFA filings and development of related procedures.</li>
<li>Resolve a wide range of issues in creative ways, suggesting different potential approaches and adapting to new circumstances rapidly.</li>
<li>Established expertise in designing, implementing, and continuously improving daily and month-end prime brokerage accounting practices, ensuring full compliance with GAAP and industry best practices.</li>
<li>Apply advanced skills in Google Sheets to efficiently and reliably work with large data sets.</li>
<li>Proven ability to account for complex matrix organizational structures and intercompany transactions.</li>
</ul>
<p><strong>Complexity and Impact of Work:</strong></p>
<ul>
<li>Lead the accounting team through the intricacies of accounting related to digital asset and OTC derivative prime brokerage activities and products, ensuring alignment with customer contracts as well as evolving regulatory and industry best practices.</li>
<li>Drive the development of scalable and automated financial processes and controls, leveraging your technical expertise to design and implement internal end-to-end trading system reporting, as well as integration of external software tools, as needed.</li>
<li>Perform and/or review month end accounting close responsibilities for brokerage related legal entities, including reviewing journal entries, account reconciliations, variance analysis, and FBO fiat and digital asset account reconciliations.</li>
<li>Provide support for internal and external audits related to specific areas of responsibility.</li>
<li>Create and maintain detailed documentation of core accounting and regulatory reporting processes.</li>
<li>Identify opportunities for improvements and efficiencies.</li>
</ul>
<p><strong>Organizational Knowledge:</strong></p>
<ul>
<li>Cultivate effective cross-functional communication with the brokerage, legal, operations and technical accounting teams to ensure comprehensive documentation and understanding of transactions and their appropriate accounting treatment.</li>
<li>Stay informed and socialize with the broader organization on industry pronouncements, emerging trends, and developments in accounting and reporting for digital assets, SEC regulations, and lending practices.</li>
</ul>
<p><strong>Communication and Influence:</strong></p>
<ul>
<li>Leverage accounting expertise to provide guidance and requirements to product, brokerage operations, legal and engineering teams in designing and implementing new products that meet financial reporting and compliance requirements.</li>
<li>Assist with internal and external audits to ensure accuracy and timeliness of financial reporting.</li>
<li>Interact with regulators such as the NFA and CFTC as we navigate the swap-dealer registration process and ongoing regulatory compliance, including examinations.</li>
</ul>
<p><strong>You may be a fit for this role if you have:</strong></p>
<ul>
<li>10+ years of related accounting experience, with a mix of Big 4 public accounting experience and/or in a fast paced startup/Fintech environment.</li>
<li>Knowledge of blockchain technology and the crypto economy.</li>
<li>Able to work well in a dynamic environment, implement process improvements, work autonomously and handle multiple tasks simultaneously.</li>
<li>Experience accounting for digital assets, prime brokerage services, digital asset borrowing/lending, and/or derivatives.</li>
<li>NFA Series 3 License or ability to obtain within 6 months.</li>
</ul>
<p><strong>Although not a requirement, bonus points if:</strong></p>
<ul>
<li>You have worked at or with investment advisors or institutional investment funds.</li>
<li>You can write Python scripts, SQL queries or build data integrations into Google Sheets/Excel.</li>
<li>You have SOX compliance experience.</li>
<li>Experience working in Netsuite and G-suite.</li>
<li>Experience accounting for digital asset prime brokerage services.</li>
<li>You were emotionally moved by the soundtrack to Hamilton, which chronicles the founding of a new financial system. :)</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>US GAAP, ASC 606, ASC 815, regulatory accounting requirements, swap-dealer accounting and reporting requirements, NFA filings, Google Sheets, accounting for complex matrix organizational structures, intercompany transactions, blockchain technology, crypto economy, Python scripts, SQL queries, data integrations into Google Sheets/Excel, SOX compliance, Netsuite, G-suite, digital asset prime brokerage services</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Anchorage Digital</Employername>
      <Employerlogo>https://logos.yubhub.co/anchorage.com.png</Employerlogo>
      <Employerdescription>Anchorage Digital is a crypto platform that enables institutions to participate in digital assets through custody, staking, trading, governance, settlement, and the industry&apos;s leading security infrastructure.</Employerdescription>
      <Employerwebsite>https://anchorage.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/anchorage/22726292-f900-4b11-b1af-e5734729c36a</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>7af16166-8fd</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>
<li>Works independently, with minimal guidance and direction, to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>
<li>Reviews high level design to ensure alignment to Solution Architecture.</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>
<li>Mentors developers and creates reference implementations/frameworks.</li>
<li>Partners with System Architects to elaborate capabilities and features.</li>
<li>Delivers single domain architecture solutions and executes continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Over 6 years of experience as a senior domain architect for Data domains</li>
<li>Advanced English Level</li>
<li>Master&#39;s degree (PLUS)</li>
<li>Insurance experience (PLUS); Financial Services experience (PLUS)</li>
</ul>
<p><strong>Technical &amp; Business Skills:</strong></p>
<ul>
<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>
<li>Data Architecture / Data Modeling - Advanced (MUST)</li>
<li>Data Warehouse - Advanced (MUST)</li>
<li>Cloud Data Platforms - Advanced</li>
<li>Data Integration Tools - Advanced</li>
<li>Snowflake or Databricks - Intermediate (4-6 Years) (MUST)</li>
<li>Any Cloud - Intermediate (4-6 Years)</li>
<li>Power BI or Tableau - Intermediate (4-6 Years)</li>
<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>
<li>Data Lakehouse - Intermediate (MUST)</li>
<li>Data Governance - Intermediate</li>
<li>AI/ML - Entry Level (PLUS)</li>
<li>Master Data Management - Intermediate</li>
<li>Operational Data Management - Intermediate</li>
</ul>
<p><strong>Benefits:</strong></p>
<p>This position comes with a competitive compensation and benefits package.</p>
<ul>
<li>A competitive salary and performance-based bonuses.</li>
<li>Comprehensive benefits package.</li>
<li>Flexible work arrangements (remote and/or office-based).</li>
<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>
<li>Private Health Insurance.</li>
<li>Paid Time Off.</li>
<li>Training &amp; Development opportunities in partnership with renowned companies.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, AI/ML, Master Data Management, Operational Data Management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global consulting and technology services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/jdUFHSPZZjHsgd3TR4R3BS/remote-fbs-senior-data-domain-architect-in-colombia-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>2a56a653-c18</externalid>
      <Title>Palantir Engineer Specialist - Sr. Consultant - Principal</Title>
      <Description><![CDATA[<p><strong>Palantir Engineer Specialist</strong></p>
<p><strong>Sr. Consultant - Principal</strong></p>
<p><strong>London</strong></p>
<p>Do you want to boost your career and collaborate with expert, talented colleagues to solve and deliver against our clients&#39; most important challenges? We are growing and are looking for people to join our team. You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organisation allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset. Are you ready?</p>
<p><strong>About Your Role</strong></p>
<p>As a <strong>Senior Consultant / Principal Consultant – Palantir Engineer</strong>, you lead and deliver end-to-end, data-driven solutions using <strong>Palantir Foundry</strong> in complex client environments. You operate at the intersection of engineering, data, and consulting, working closely with business and technical stakeholders to translate complex problems into scalable, production-ready solutions. You combine strong hands-on technical skills with a consulting mindset, taking ownership of solution design, implementation, and adoption across organisations.</p>
<p><strong>Your role will include:</strong></p>
<ul>
<li>Own the <strong>end-to-end delivery</strong> of Palantir Foundry–based solutions, from problem definition to production</li>
<li>Design and implement <strong>data pipelines and transformations</strong> across diverse data sources</li>
<li>Model data using <strong>Foundry Ontology</strong> concepts to support analytics and operational use cases</li>
<li>Build scalable, reliable solutions using <strong>Python, SQL, and PySpark</strong> within Foundry</li>
<li>Collaborate closely with business stakeholders to define requirements, success metrics, and roadmaps</li>
<li>Support <strong>prototyping, productionisation, and scaling</strong> of data-driven applications</li>
<li>Ensure solutions meet requirements for <strong>data quality, governance, security, and performance</strong></li>
<li>Act as a technical advisor within project teams and contribute to best practices</li>
</ul>
<p><strong>Requirements</strong></p>
<p><strong>What you bring – required</strong></p>
<p><strong>Experience &amp; Seniority</strong></p>
<ul>
<li>Proven experience as a <strong>Senior Consultant or Principal Consultant</strong> in data, analytics, or platform engineering</li>
<li>Strong experience delivering <strong>client-facing data solutions</strong> in complex environments</li>
<li>Ability to take ownership and work independently in ambiguous problem spaces</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong programming skills in <strong>Python</strong> and <strong>SQL</strong>; <strong>PySpark</strong> experience required</li>
<li>Hands-on experience with <strong>Palantir Foundry</strong>, including:
<ul>
<li>Pipeline Builder / Code Workbook</li>
<li>Data integration and transformation</li>
<li>Ontology modelling and data lineage</li>
</ul>
</li>
<li>Solid understanding of <strong>data architectures</strong>, including data lakes, lakehouses, and data warehouses</li>
<li>Experience working with APIs, databases, and structured / semi-structured data</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience building <strong>scalable ETL/ELT pipelines</strong></li>
<li>Familiarity with <strong>CI/CD concepts</strong>, testing, and production deployments</li>
<li>Strong focus on <strong>solution quality, maintainability, and performance</strong></li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field <strong>or equivalent practical experience</strong></li>
</ul>
<p><strong>Nice to have</strong></p>
<ul>
<li>Experience with <strong>cloud platforms</strong> (AWS, Azure, GCP)</li>
<li>Familiarity with <strong>containerisation</strong> (Docker, Kubernetes)</li>
<li>Prior experience as a <strong>Palantir FDE</strong> or in Foundry-heavy delivery roles</li>
<li>Domain experience in industries such as <strong>Energy, Finance, Public Sector, Healthcare, or Logistics</strong></li>
</ul>
<p><strong>Benefits</strong></p>
<p><strong>About your team</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognised as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity, and our dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us to its list of Best Firms to Work For. Furthermore, Infosys has been recognised by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, SQL, PySpark, Palantir Foundry, Pipeline Builder, Code Workbook, Data integration, Data transformation, Ontology modelling, Data lineage, Data architectures, Data lakes, Lakehouses, Data warehouses, APIs, Databases, Structured data, Semi-structured data, ETL/ELT pipelines, CI/CD concepts, Testing, Production deployments, Solution quality, Maintainability, Performance, Bachelor’s degree, Master’s degree, Computer Science, Engineering, Mathematics, Cloud platforms, Containerisation, Palantir FDE, Foundry-heavy delivery roles, Domain experience in industries such as Energy, Finance, Public Sector, Healthcare, or Logistics</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. The company is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/2A8U1ryerVijb4fFAc6i8u/hybrid-palantir-engineer-specialist---sr.-consultant---principal-in-london-at-infosys-consulting---europe</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7b03b30a-b20</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>
<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>
<li>Works independently, with minimal guidance and direction, to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>
<li>Reviews high level design to ensure alignment to Solution Architecture.</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>
<li>Mentors developers and creates reference implementations/frameworks.</li>
<li>Partners with System Architects to elaborate capabilities and features.</li>
<li>Delivers single domain architecture solutions and executes continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, Master Data Management, Operational Data Management, AI/ML</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/1U952YA2QBa8zK7Tm5d3Lm/remote-fbs-senior-data-domain-architect-in-mexico-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>dcfed817-412</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Senior Data Domain Architect to join our team. As a Senior Data Domain Architect, you will design and develop Data/Domain IT architecture solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilize in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery</li>
<li>Solve complex problems and partner effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization</li>
<li>Work independently, with minimal guidance and direction, to solve for and influence Enterprise and System architecture through Domain level knowledge</li>
<li>Review high level design to ensure alignment to Solution Architecture</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives</li>
<li>Mentor developers and create reference implementations/frameworks</li>
<li>Partner with System Architects to elaborate capabilities and features</li>
<li>Deliver single domain architecture solutions and execute a continuous domain level architecture improvement roadmap. Actively support design and steering of a continuous delivery pipeline</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Over 6 years of experience as a senior domain architect for Data domains</li>
<li>Advanced English Level</li>
<li>Master&#39;s degree (PLUS)</li>
<li>Insurance experience (PLUS); Financial Services experience (PLUS)</li>
</ul>
<p><strong>Technical &amp; Business Skills:</strong></p>
<ul>
<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>
<li>Data Architecture / Data Modeling - Advanced (MUST)</li>
<li>Data Warehouse - Advanced (MUST)</li>
<li>Cloud Data Platforms - Advanced</li>
<li>Data Integration Tools - Advanced</li>
<li>Snowflake or Databricks - Intermediate (4-6 Years) (MUST)</li>
<li>Any Cloud - Intermediate (4-6 Years)</li>
<li>Power BI or Tableau - Intermediate (4-6 Years)</li>
<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>
<li>Data Lakehouse - Intermediate (MUST)</li>
</ul>
<p><strong>Benefits:</strong></p>
<ul>
<li>A competitive salary and performance-based bonuses</li>
<li>Comprehensive benefits package</li>
<li>Flexible work arrangements (remote and/or office-based)</li>
<li>Private Health Insurance</li>
<li>Paid Time Off</li>
<li>Training &amp; Development opportunities in partnership with renowned companies</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/x7tKXYFBB815ca6oBV5T2E/remote-fbs-senior-data-domain-architect-in-brazil-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>ee2fcbdc-fc4</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will act as a senior technical leader in complex data and analytics engagements, shaping and governing end-to-end enterprise data architectures, leading technical teams, and serving as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will be responsible for ensuring that enterprise data and analytics solutions are scalable, secure, and production-ready, while translating business requirements into robust technical designs and delivery roadmaps.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:
<ul>
<li>Data Mesh / Data Fabric / data lake / data warehouse architectures</li>
<li>Modern Data Architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data Platform, DevOps, deployment and security architectures</li>
<li>Analytics and AI enablement architectures</li>
</ul>
</li>
<li>Hands-on experience with cloud data platforms, e.g.:
<ul>
<li>Azure, AWS or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
</ul>
</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:
<ul>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data Management software and tools</li>
<li>Security, access control, and compliance considerations</li>
</ul>
</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity, and our dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us to its list of Best Firms to Work For. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, Azure, AWS or GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, Postgres, SQL Server, Oracle, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, Docker / Kubernetes, Advanced analytics, AI / ML or GenAI, Streaming platforms (e.g. Kafka, Azure Event Hubs), Data governance or metadata tools, Cloud, data, or architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. The company is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/uuSzzCt8qNbo6UpEFkSyjY/hybrid-principal-consultant---data-architecture-in-london-at-infosys-consulting---europe</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>56dc9a51-e66</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will act as a senior technical leader in complex data and analytics engagements. You will shape and govern end-to-end enterprise data architectures, lead technical teams, and serve as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:
<ul>
<li>Data Mesh / Data Fabric / data lake / data warehouse architectures</li>
<li>Modern Data Architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data Platform, DevOps, deployment and security architectures</li>
<li>Analytics and AI enablement architectures</li>
</ul>
</li>
<li>Hands-on experience with cloud data platforms, e.g.:
<ul>
<li>Azure, AWS or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
</ul>
</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:
<ul>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data Management software and tools</li>
<li>Security, access control, and compliance considerations</li>
</ul>
</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>You will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity, and our dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us to its list of Best Firms to Work For. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>enterprise data architecture, system data integration, data engineering, analytics, modern data architectures, Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, cloud data platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, SQL, relational databases, Postgres, SQL Server, Oracle, NoSQL databases, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, data migration programmes, data pipelines, orchestration, automation, CI/CD concepts, production-grade deployments, distributed systems, Docker, Kubernetes, data management and governance principles, data quality, metadata, lineage, master data management, data management software and tools, security, access control, compliance considerations, Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience, advanced analytics, AI / ML or GenAI, streaming platforms, Kafka, Azure Event Hubs, data governance or metadata tools, cloud, data, architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. It is a mid-size player with a supportive, entrepreneurial spirit.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/hpBWjvvy8D6B1f818cHxZR/remote-principal-consultant---data-architecture-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>01be118d-100</externalid>
      <Title>Palantir Engineer Specialist - Sr. Consultant - Principal</Title>
      <Description><![CDATA[<p><strong><strong>About Your Role</strong></strong></p>
<p>As a Senior Consultant / Principal Consultant – Palantir Engineer, you will lead and deliver end-to-end, data-driven solutions using Palantir Foundry in complex client environments. You will operate at the intersection of engineering, data, and consulting, working closely with business and technical stakeholders to translate complex problems into scalable, production-ready solutions.</p>
<p><strong>Your role will include:</strong></p>
<ul>
<li>Own the end-to-end delivery of Palantir Foundry–based solutions, from problem definition to production</li>
<li>Design and implement data pipelines and transformations across diverse data sources</li>
<li>Model data using Foundry Ontology concepts to support analytics and operational use cases</li>
<li>Build scalable, reliable solutions using Python, SQL, and PySpark within Foundry</li>
<li>Collaborate closely with business stakeholders to define requirements, success metrics, and roadmaps</li>
<li>Support prototyping, productionisation, and scaling of data-driven applications</li>
<li>Ensure solutions meet requirements for data quality, governance, security, and performance</li>
<li>Act as a technical advisor within project teams and contribute to best practices</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Proven experience as a Senior Consultant or Principal Consultant in data, analytics, or platform engineering</li>
<li>Strong experience delivering client-facing data solutions in complex environments</li>
<li>Ability to take ownership and work independently in ambiguous problem spaces</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong programming skills in Python and SQL; PySpark experience required</li>
<li>Hands-on experience with Palantir Foundry, including:
<ul>
<li>Pipeline Builder / Code Workbook</li>
<li>Data integration and transformation</li>
<li>Ontology modelling and data lineage</li>
</ul>
</li>
<li>Solid understanding of data architectures, including data lakes, lakehouses, and data warehouses</li>
<li>Experience working with APIs, databases, and structured / semi-structured data</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience building scalable ETL/ELT pipelines</li>
<li>Familiarity with CI/CD concepts, testing, and production deployments</li>
<li>Strong focus on solution quality, maintainability, and performance</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to have</strong></p>
<ul>
<li>Experience with cloud platforms (AWS, Azure, GCP)</li>
<li>Familiarity with containerisation (Docker, Kubernetes)</li>
<li>Prior experience as a Palantir FDE or in Foundry-heavy delivery roles</li>
<li>Domain experience in industries such as Energy, Finance, Public Sector, Healthcare, or Logistics</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice, you will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role, you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, and Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Infosys Consulting is a globally renowned management consulting firm that is on the front-line of industry disruption. We are a mid-size player with a supportive, entrepreneurial spirit that works with a market-leading brand in every sector, while our parent organization Infosys is a top-5 powerhouse IT brand that is outperforming the market and experiencing rapid growth.</p>
<p>Our consulting business is annually recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity, and the dedicated training and career paths we offer our consultants. We are committed to fostering an inclusive work culture that inspires everyone to deliver their best.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, SQL, PySpark, Palantir Foundry, Pipeline Builder / Code Workbook, Data integration and transformation, Ontology modelling and data lineage, Data architectures, APIs, Databases, Structured / semi-structured data, Cloud platforms, Containerisation, Palantir FDE, Foundry-heavy delivery roles, Domain experience in industries such as Energy, Finance, Public Sector, Healthcare, or Logistics</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting is a globally renowned management consulting firm that works with market-leading brands across sectors. It is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/2u6mMfyRc8Yxg8qmvZBSMX/remote-palantir-engineer-specialist---sr.-consultant---principal-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>bc13ad37-8f9</externalid>
      <Title>ACT Integration Solutions Specialist – Aladdin Platform, Associate</Title>
      <Description><![CDATA[<p>About this role</p>
<p>We are seeking a dynamic Specialist with a proven track record in data integration and data conversion to join our growing Integration Solutions Practice. This role is pivotal in scaling our projects across various asset classes and front-to-back Aladdin services, employing Aladdin’s Studio platform and its robust capabilities in data integration and onboarding – which include file-based mechanisms, APIs, and the Aladdin Data Cloud. This is a client-facing position offering an opportunity to collaborate with a broad range of internal stakeholders at BlackRock.</p>
<p><strong>Main Function</strong></p>
<p>As an Integration Solutions Specialist for the Aladdin Platform, you will play a crucial role on Aladdin client projects in orchestrating specific technology channels that cover client activities related to data integration and data conversion or onboarding. You will be responsible for driving our project channels, providing best practice guidance to our clients’ technology teams, and supporting those teams as they build their integration and transition their data to our platform. You will work closely with the core Aladdin implementation teams to deliver on client commitments. Your expertise will be instrumental in securing an on-time and on-budget project outcome.</p>
<p><strong>Responsibilities</strong></p>
<p>Your key responsibilities will include:</p>
<ul>
<li><p>Own and manage Aladdin channels across Data Interfaces, Data Conversions, and Data Cloud (ADC) Deployments</p>
</li>
<li><p>Liaise with relevant client counterparts, including senior technology managers responsible for delivering the required integration and migration builds</p>
</li>
<li><p>Work with client teams to create the necessary plans and trackers, and coordinate resources and updates</p>
</li>
<li><p>Lead future state architecture sessions with client teams, to drive project scope and identify challenges</p>
</li>
<li><p>Lead and assist in interface design and data mapping sessions, and support client development needs</p>
</li>
<li><p>Support the enablement of our clients working with Aladdin Studio technology and Aladdin data</p>
</li>
<li><p>Gain an in-depth knowledge of Aladdin functionality and workflows to ensure the correct integration solutions are deployed and utilized</p>
</li>
<li><p>Work closely with partners across the Aladdin business and platform to support client development needs</p>
</li>
</ul>
<p><strong>Preferred Qualifications</strong></p>
<ul>
<li><p>Bachelor’s or Master’s degree in Engineering, Computer Science, Mathematics, or a related quantitative field</p>
</li>
<li><p>Fluency in English and excellent communication skills, with the ability to convey concepts clearly and simply</p>
</li>
<li><p>Experience with financial systems integration, cloud-based computing, or automation tools and methodologies</p>
</li>
<li><p>Understanding of finance and public or private markets, the investment lifecycle, and associated workflows</p>
</li>
<li><p>Demonstrated problem-solving prowess and a proactive mindset</p>
</li>
<li><p>Well-organized with strong project management capabilities</p>
</li>
<li><p>Ability to work collaboratively and independently within a team environment</p>
</li>
<li><p>Proficiency in SQL and a working knowledge of ETL processes are must-haves</p>
</li>
<li><p>Familiarity with programming languages such as Python or R, APIs, and data transformation tools such as Alteryx Designer is highly regarded</p>
</li>
</ul>
<p><strong>Our benefits</strong></p>
<p>To help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment and tools designed to help you in building a sound financial future; access to education reimbursement; comprehensive resources to support your physical health and emotional well-being; family support programs; and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</p>
<p><strong>Our hybrid work model</strong></p>
<p>BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.</p>
<p><strong>About BlackRock</strong></p>
<p>At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data integration, data conversion, Aladdin Studio platform, SQL, ETL processes, Python, R, APIs, data transformation tools, financial systems integration, cloud-based computing, automation tools and methodologies, Alteryx Designer</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>BlackRock</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>BlackRock is a global investment management company that provides a range of investment products and services to institutional and individual investors.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/g8YUpXA9aV6CnPMQkdnvSM/act-integration-solutions-specialist-%E2%80%93-aladdin-platform%2C-associate-in-edinburgh-at-blackrock</Applyto>
      <Location>Edinburgh</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>8e385f8d-94b</externalid>
      <Title>Technical Writer, Aladdin Studio, Associate</Title>
      <Description><![CDATA[<p><strong>About this role</strong></p>
<p>We&#39;re seeking an experienced Technical Writer to help define and deliver world-class documentation for the Aladdin Studio developer platform. Aladdin Studio is transforming how developers and data engineers interact with the Aladdin ecosystem, enabling teams to build, integrate, and extend the Aladdin platform through open APIs and event-driven workflows.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Develop and maintain clear, accurate, and engaging documentation for Aladdin Studio&#39;s APIs, SDKs, and event streaming interfaces.</li>
<li>Collaborate closely with engineers, product managers, and developer experience teams to translate complex technical concepts into approachable guides, tutorials, and reference materials.</li>
<li>Document event-driven workflows, including streaming APIs, webhook subscriptions, and real-time data integration patterns.</li>
<li>Design and implement content structures that scale across multiple APIs, microservices, and event channels within Aladdin Studio.</li>
<li>Contribute to tooling and automation, using OpenAPI/AsyncAPI specs and CI/CD pipelines to generate and version developer documentation.</li>
<li>Partner with the Studio Product Marketing and Solution Architecture teams to create onboarding materials, sample code, and “getting started” experiences for external and internal developers.</li>
<li>Continuously improve the discoverability and usability of content within the Aladdin Studio Developer Portal.</li>
<li>Champion documentation standards across the Aladdin Product Group, ensuring consistency, clarity, and technical accuracy.</li>
</ul>
<p><strong>Required Qualifications</strong></p>
<ul>
<li>2+ years of experience as a Technical Writer, Developer Advocate, or Software Engineer focused on APIs or event-driven systems.</li>
<li>Deep understanding of RESTful and event-streaming architectures (real-time distributed event streaming platforms such as Kinesis, or similar).</li>
<li>Proven experience writing API and developer documentation using OpenAPI/Swagger or AsyncAPI specifications.</li>
<li>Hands-on familiarity with developer tooling such as Git, Postman, Redocly, or similar platforms.</li>
<li>Strong grasp of cloud-based integration concepts, including authentication, webhooks, and event publishing/subscription models.</li>
<li>Excellent written communication skills and ability to translate complex systems into developer-friendly content.</li>
<li>Proficiency in Markdown, YAML, and basic scripting (Python, JavaScript, or similar).</li>
</ul>
<p><strong>What We Offer</strong></p>
<ul>
<li>Opportunity to shape the developer experience for the Aladdin ecosystem, a platform trusted by the world’s largest financial institutions.</li>
<li>A collaborative, growth-oriented environment within BlackRock’s Aladdin Product Group.</li>
<li>Competitive compensation, benefits, and professional development opportunities.</li>
<li>Direct impact on how external developers and partners extend Aladdin’s capabilities through APIs and streaming data.</li>
</ul>
<p><strong>Our benefits</strong></p>
<p>To help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment and tools designed to help you in building a sound financial future; access to education reimbursement; comprehensive resources to support your physical health and emotional well-being; family support programs; and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.</p>
<p><strong>Our hybrid work model</strong></p>
<p>BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>API documentation, event streaming, developer experience, OpenAPI/Swagger, AsyncAPI, Git, Postman, Redocly, Markdown, YAML, Python, JavaScript, data integration, analytics, asset and risk management, streaming data pipelines, real-time analytics, API lifecycle management, continuous documentation delivery</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>BlackRock</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>BlackRock is a global investment management company that provides a range of investment products and services to institutional and individual investors.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/gyzyNbsJN1TLzb2rwyj4yv/technical-writer%2C-aladdin-studio%2C-associate-in-edinburgh-at-blackrock</Applyto>
      <Location>Edinburgh, Scotland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>62a97c66-49a</externalid>
      <Title>Software Engineer (GenAI &amp; Data Engineering)</Title>
      <Description><![CDATA[<p>Join our Data Architecture team to design and deliver AI-driven solutions and data engineering capabilities that power smarter decision-making and accelerate innovation across the business.</p>
<p>This Junior Software Engineer role combines generative AI expertise with strong data engineering skills, working on cutting-edge technologies that support engineering and data science initiatives.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Develop generative AI solutions for knowledge processing, including Retrieval-Augmented Generation (RAG) pipelines and agentic AI systems</li>
<li>Build and maintain data engineering pipelines to support AI and analytics workloads</li>
<li>Collaborate with engineering teams to embed AI and data best practices into development processes</li>
<li>Implement and help define coding standards and patterns for AI and data-driven applications</li>
<li>Solve complex problems using innovative AI frameworks and scalable data solutions</li>
<li>Actively integrate generative AI tools and models into daily workflows</li>
<li>Stay ahead of emerging AI and data engineering trends and apply them to real-world solutions</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Experience in generative AI and data engineering</li>
<li>Skilled in Python and C# .NET, with a focus on AI and data integration</li>
<li>Solid understanding of modern software engineering principles and their application in AI and data-driven solutions</li>
<li>Extensive use of generative AI in your day-to-day work to accelerate development and problem-solving</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Be supported to continually learn and improve your tech skills</li>
<li>Share knowledge and ideas in the team</li>
<li>Be part of a dynamic and open culture</li>
<li>Enjoy a supportive environment that encourages growth and learning</li>
<li>Outstanding flexibility to support a healthy work-life balance</li>
<li>Hybrid working (home and office-based split)</li>
<li>Medical and Life insurance</li>
<li>Volunteer day, enhanced paid parental leave and wellness benefits</li>
<li>Fun team events including the Vista Innovation Cup</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>generative AI, data engineering, Python, C# .NET, AI and data integration, modern software engineering principles</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Vista Group</Employername>
      <Employerlogo>https://logos.yubhub.co/j.com.png</Employerlogo>
      <Employerdescription>Vista Group is a technology company that provides software solutions for the entertainment industry.</Employerdescription>
      <Employerwebsite>https://apply.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/912BDF1CA3</Applyto>
      <Location>Auckland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>20b8bc18-582</externalid>
      <Title>Head of Rolling Stock Systems</Title>
      <Description><![CDATA[<p>Take your career in a new direction at Eurostar, a leading rail operator in five countries. As Head of Rolling Stock Systems, you will oversee the management and improvement of depot systems and onboard tools, ensuring seamless integration and utilisation of data across depot systems. This role is responsible for developing and managing a clear product roadmap aligned with organisational priorities and staff needs.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Lead the end-to-end strategy, delivery, and continuous improvement of Depot and Onboard systems</li>
<li>Develop and manage a clear product roadmap aligned with organisational priorities and staff needs</li>
<li>Support and manage a team of 5 (including contractors), which may increase in the future as more projects come in</li>
<li>Collaborate with engineering, operations, and cross-functional teams to ensure system integration, performance, and alignment</li>
<li>Drive transformation by identifying process and technology improvements and leading effective change management</li>
<li>Oversee the integration, quality, and accessibility of data across all depot systems, enabling data-driven decision making</li>
<li>Monitor key performance metrics and use insights to optimise system effectiveness and operational outcomes</li>
<li>Build strong relationships with stakeholders and communicate progress, challenges, and achievements clearly</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Master&#39;s degree in engineering, information systems, or a related field</li>
<li>Proven experience in product ownership or product management, ideally within rolling stock, transportation, or industrial systems</li>
<li>Minimum of 5 years&#39; experience in a similar role</li>
<li>Strong strategic thinking, product management expertise, and change leadership skills</li>
<li>Solid knowledge of data integration and analysis, continuous improvement, and cross-functional collaboration</li>
<li>Fluency in English and French is essential for this role</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Travel benefits that can be used for both work and play, including 75% off the Underground network from Day 1</li>
<li>Competitive defined benefit pension scheme</li>
<li>Free Eurostar tickets</li>
<li>Discounted Eurostar tickets for friends and family</li>
<li>Ongoing training and development</li>
<li>Lots of other exclusive deals, discounts, and perks</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>product management, data integration, continuous improvement, cross-functional collaboration, change leadership, strategic thinking, product ownership, information systems, industrial systems</Skills>
      <Category>Engineering</Category>
      <Industry>Transportation</Industry>
      <Employername>Eurostar</Employername>
      <Employerlogo>https://logos.yubhub.co/j.com.png</Employerlogo>
      <Employerdescription>Eurostar operates high-speed trains connecting five countries and is a significant player in the rail industry.</Employerdescription>
      <Employerwebsite>https://apply.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/3CB670BC59</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>0310fc87-24d</externalid>
      <Title>Incentive Compensation System Engineer</Title>
      <Description><![CDATA[<p>As an Incentive Compensation Systems Engineer, you will lead the technical implementation, configuration, and optimization of our incentive compensation management (ICM) platform. You will work closely with Revenue Operations, Data Engineering, Finance, Accounting, and HR to translate incentive compensation designs into scalable system configurations, ensuring accurate calculations, reliable integrations, and efficient workflows.</p>
<p><strong>Responsibilities:</strong></p>
<p><strong>ICM Platform Implementation &amp; Configuration</strong></p>
<ul>
<li>Lead the end-to-end implementation and configuration of our ICM platform, including plan modeling, crediting rules, quota management, and payout calculations</li>
<li>Translate compensation plan documents into system logic, ensuring accurate and auditable calculation outputs</li>
<li>Design and maintain system workflows for plan changes, exceptions, and adjustments</li>
<li>Develop and execute testing strategies to validate system accuracy before each compensation cycle</li>
<li>Provide technical guidance during plan design discussions, advising on system capabilities and constraints</li>
</ul>
<p><strong>Data Integration &amp; Architecture</strong></p>
<ul>
<li>Build and maintain integrations between the ICM platform and source systems (CRM, ERP, HRIS, billing platforms)</li>
<li>Ensure data integrity across the compensation data pipeline, from opportunity data through final payout</li>
<li>Partner with Data Engineering and Revenue Operations to define data requirements and resolve data quality issues</li>
<li>Document data flows, transformation logic, and system dependencies</li>
</ul>
<p><strong>System Administration &amp; Optimization</strong></p>
<ul>
<li>Serve as the primary system administrator for the ICM platform, managing user access, security, and system health</li>
<li>Identify opportunities to streamline and automate compensation processes, reducing manual intervention and cycle time</li>
<li>Monitor system performance and troubleshoot calculation errors or integration failures</li>
<li>Evaluate and recommend new technologies, including Claude-powered solutions, to enhance system capabilities</li>
</ul>
<p><strong>Reporting &amp; Analytics</strong></p>
<ul>
<li>Build and maintain reporting infrastructure within the ICM platform to support compensation dashboards and attainment tracking</li>
<li>Partner with Revenue Operations and Accounting to deliver accurate, timely compensation data for forecasting and accruals</li>
<li>Enable self-service reporting for compensation administrators and business partners</li>
</ul>
<p><strong>You may be a good fit if you have:</strong></p>
<ul>
<li>A passion for Anthropic&#39;s mission to build safe, transformative AI systems</li>
<li>5+ years of experience implementing or administering ICM platforms (e.g., Xactly, Varicent, Anaplan)</li>
<li>Strong technical skills with experience in SQL, data integration tools, and system configuration</li>
<li>Proven ability to translate business requirements into system logic and maintain accurate, auditable calculations</li>
<li>Experience managing data pipelines and ensuring data quality across integrated systems</li>
<li>Strong problem solving skills with a systematic approach to debugging and root cause analysis</li>
<li>Excellent documentation practices and ability to communicate technical concepts to non-technical stakeholders</li>
<li>Bachelor&#39;s degree in a technical or quantitative field preferred</li>
</ul>
<p><strong>Strong candidates may also have:</strong></p>
<ul>
<li>Experience with full ICM platform implementation projects, including vendor selection and data migration</li>
<li>Proficiency with APIs and integration platforms (e.g., Workato, Fivetran, custom ETL)</li>
<li>Experience with Salesforce administration or development</li>
<li>Familiarity with consumption-based compensation models</li>
<li>Experience with Claude Code or other AI-assisted development tools</li>
<li>Background in sales operations, revenue operations, or finance systems</li>
</ul>
<p>The annual compensation range for this role is $190,000 - $270,000 USD.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$190,000 - $270,000 USD</Salaryrange>
      <Skills>ICM platform implementation, SQL, data integration tools, system configuration, data pipeline management, problem solving, documentation, communication, full ICM platform implementation, APIs and integration platforms, Salesforce administration, consumption-based compensation models, Claude Code, sales operations, revenue operations, finance systems</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a rapidly growing organisation focused on creating reliable, interpretable, and steerable AI systems. The company aims to build safe and beneficial AI systems for users and society as a whole.</Employerdescription>
      <Employerwebsite>https://job-boards.greenhouse.io</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5042002008</Applyto>
      <Location>San Francisco, CA | New York City, NY</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>25829001-a4a</externalid>
      <Title>Incentive Compensation System Engineer</Title>
      <Description><![CDATA[<p>As an Incentive Compensation Systems Engineer, you will lead the technical implementation, configuration, and optimization of our incentive compensation management (ICM) platform. You will work closely with Revenue Operations, Data Engineering, Finance, Accounting, and HR to translate incentive compensation designs into scalable system configurations, ensuring accurate calculations, reliable integrations, and efficient workflows.</p>
<p><strong>Responsibilities:</strong></p>
<p><strong>ICM Platform Implementation &amp; Configuration</strong></p>
<ul>
<li>Lead the end-to-end implementation and configuration of our ICM platform, including plan modeling, crediting rules, quota management, and payout calculations</li>
<li>Translate compensation plan documents into system logic, ensuring accurate and auditable calculation outputs</li>
<li>Design and maintain system workflows for plan changes, exceptions, and adjustments</li>
<li>Develop and execute testing strategies to validate system accuracy before each compensation cycle</li>
<li>Provide technical guidance during plan design discussions, advising on system capabilities and constraints</li>
</ul>
<p><strong>Data Integration &amp; Architecture</strong></p>
<ul>
<li>Build and maintain integrations between the ICM platform and source systems (CRM, ERP, HRIS, billing platforms)</li>
<li>Ensure data integrity across the compensation data pipeline, from opportunity data through final payout</li>
<li>Partner with Data Engineering and Revenue Operations to define data requirements and resolve data quality issues</li>
<li>Document data flows, transformation logic, and system dependencies</li>
</ul>
<p><strong>System Administration &amp; Optimization</strong></p>
<ul>
<li>Serve as the primary system administrator for the ICM platform, managing user access, security, and system health</li>
<li>Identify opportunities to streamline and automate compensation processes, reducing manual intervention and cycle time</li>
<li>Monitor system performance and troubleshoot calculation errors or integration failures</li>
<li>Evaluate and recommend new technologies, including Claude-powered solutions, to enhance system capabilities</li>
</ul>
<p><strong>Reporting &amp; Analytics</strong></p>
<ul>
<li>Build and maintain reporting infrastructure within the ICM platform to support compensation dashboards and attainment tracking</li>
<li>Partner with Revenue Operations and Accounting to deliver accurate, timely compensation data for forecasting and accruals</li>
<li>Enable self-service reporting for compensation administrators and business partners</li>
</ul>
<p><strong>You may be a good fit if you have:</strong></p>
<ul>
<li>A passion for Anthropic&#39;s mission to build safe, transformative AI systems</li>
<li>5+ years of experience implementing or administering ICM platforms (e.g., Xactly, Varicent, Anaplan)</li>
<li>Strong technical skills with experience in SQL, data integration tools, and system configuration</li>
<li>Proven ability to translate business requirements into system logic and maintain accurate, auditable calculations</li>
<li>Experience managing data pipelines and ensuring data quality across integrated systems</li>
<li>Strong problem solving skills with a systematic approach to debugging and root cause analysis</li>
<li>Excellent documentation practices and ability to communicate technical concepts to non-technical stakeholders</li>
<li>Bachelor&#39;s degree in a technical or quantitative field preferred</li>
</ul>
<p><strong>Strong candidates may also have:</strong></p>
<ul>
<li>Experience with full ICM platform implementation projects, including vendor selection and data migration</li>
<li>Proficiency with APIs and integration platforms (e.g., Workato, Fivetran, custom ETL)</li>
<li>Experience with Salesforce administration or development</li>
<li>Familiarity with consumption-based compensation models</li>
<li>Experience with Claude Code or other AI-assisted development tools</li>
<li>Background in sales operations, revenue operations, or finance systems</li>
</ul>
<p><strong>Logistics</strong></p>
<p><strong>Education requirements:</strong> We require at least a Bachelor&#39;s degree in a related field or equivalent experience.</p>
<p><strong>Location-based hybrid policy:</strong> Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</p>
<p><strong>Visa sponsorship:</strong> We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</p>
<p><strong>We encourage you to apply even if you do not believe you meet every single qualification.</strong> Not all strong candidates will meet every single qualification as listed. Research shows that people who identify as being from underrepresented groups are more prone to experiencing imposter syndrome and doubting the strength of their candidacy, so we urge you not to exclude yourself prematurely and to submit an application if you&#39;re interested in this work.</p>
<p><strong>Your safety matters to us.</strong> To protect yourself from potential scams, remember that Anthropic recruiters only contact you from @anthropic.com email addresses. In some cases, we may partner with vetted recruiting agencies who will identify themselves as working on behalf of Anthropic. Be cautious of emails from other domains. Legitimate Anthropic recruiters will never ask for money, fees, or banking information.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, data integration tools, system configuration, ICM platforms, Xactly, Varicent, Anaplan, APIs, integration platforms, Workato, Fivetran, custom ETL, Salesforce administration, development, consumption-based compensation models, Claude Code, AI-assisted development tools, sales operations, revenue operations, finance systems</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a rapidly growing organisation that aims to create reliable, interpretable, and steerable AI systems. The company has a team of researchers, engineers, policy experts, and business leaders working together to build beneficial AI systems.</Employerdescription>
      <Employerwebsite>https://job-boards.greenhouse.io</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5141849008</Applyto>
      <Location>Ontario</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>8ae6102f-700</externalid>
      <Title>GRC Automation Engineering Lead</Title>
      <Description><![CDATA[<p><strong>About the Role</strong></p>
<p>We are seeking a GRC Automation Lead to join our GRC organisation and build the technical foundation for how we scale our risk and compliance programs. In this role, you will lead the team that designs and implements automated workflows, data pipelines, and integrations that transform manual compliance processes into scalable engineering systems.</p>
<p>This is a greenfield opportunity to establish the team, architecture, and integrations that will define how we approach governance, risk, and compliance at Anthropic. The core challenge is a data problem: compliance information lives across dozens of systems—cloud infrastructure, identity providers, HR platforms, ticketing tools, code repositories—and your job is to design systems that bring it together, normalise it, and make it actionable.</p>
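<p>To make the aggregation and normalisation challenge concrete, here is a small Python sketch that maps records from a few example source systems onto one common evidence shape keyed by control. The source names, field names, and control labels are assumptions made for this sketch, not an actual GRC schema or tool integration.</p>
<pre><code>import json
from datetime import datetime, timezone

# Illustrative only: the source systems, field names, and control labels below
# are assumptions made up for this sketch, not an actual GRC schema or integration.
RAW_RECORDS = [
    {"source": "identity_provider", "user": "a.chen", "mfaEnabled": True,
     "exported": "2026-03-01T08:00:00Z"},
    {"source": "hris", "email": "a.chen@example.com", "status": "active"},
    {"source": "ticketing", "assignee": "a.chen", "open_access_reviews": 0},
]

def normalise(record: dict) -> dict:
    """Map each source's schema onto one common evidence shape keyed by control."""
    now = datetime.now(timezone.utc).isoformat()
    if record["source"] == "identity_provider":
        return {"control": "MFA enforced", "subject": record["user"],
                "passed": record["mfaEnabled"], "observed_at": record["exported"]}
    if record["source"] == "hris":
        return {"control": "Active employment", "subject": record["email"].split("@")[0],
                "passed": record["status"] == "active", "observed_at": now}
    if record["source"] == "ticketing":
        return {"control": "Access reviews closed", "subject": record["assignee"],
                "passed": record["open_access_reviews"] == 0, "observed_at": now}
    raise ValueError("unknown source: " + record["source"])

print(json.dumps([normalise(r) for r in RAW_RECORDS], indent=2))
</code></pre>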
<p>At Anthropic, you&#39;ll also have a unique advantage: the ability to design AI-powered workflows where Claude acts as an extension of your team, handling tasks that would traditionally require additional headcount or manual effort. You&#39;ll need ingenuity to identify where agentic AI can accelerate evidence collection, interpret unstructured data, triage compliance gaps, and augment human judgment in risk assessments.</p>
<p>Working closely with Security, IT, and Engineering teams, you&#39;ll translate compliance and regulatory requirements into solutions that support audit programs including SOC 2, ISO, HIPAA, and FedRAMP, building systems that combine traditional automation with AI capabilities to achieve scale that wouldn&#39;t otherwise be possible.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Lead the team that establishes foundational GRC processes and architecture. Design and build automated workflows for risk management and compliance, creating scalable systems that enable continuous monitoring as Anthropic grows.</li>
<li>Build data pipelines that aggregate risk, control, and asset information from across our technology stack. This means solving hard data integration problems: mapping disparate schemas, handling inconsistent data quality, and creating unified views of compliance posture through dashboards and reporting tools.</li>
<li>Inform GRC platform strategy and implementation: in partnership with other programs, evaluate, select, and deploy tooling that meets our compliance requirements.</li>
<li>Translate written policies and compliance requirements into policy-as-code—working with Engineering and Security teams to express requirements as enforceable rules, automated checks, and continuous validation rather than static documents.</li>
<li>Establish feedback loops between policy and implementation: surface where technical controls diverge from written requirements, identify where policies need to evolve based on infrastructure realities, and ensure that compliance requirements are expressed in terms engineers can act on.</li>
<li>Design and deploy agentic AI workflows that extend team capacity, using Claude to automate evidence analysis, monitor control effectiveness, draft audit responses, interpret policy documents, and handle other tasks that require reasoning over unstructured information.</li>
<li>Design and maintain integrations connecting GRC tooling with cloud infrastructure, identity management systems, HRIS platforms, ticketing systems, version control, and CI/CD pipelines—working with engineers to implement integrations that enable automated evidence collection and continuous compliance validation.</li>
<li>Build and lead the GRC Automation function as we scale: hiring team members, establishing practices, and defining the technical roadmap for governance and compliance automation at Anthropic.</li>
</ul>
<p><strong>You may be a good fit if you:</strong></p>
<ul>
<li>Have 3-4+ years of experience managing technical individual contributors or systems-focused teams, with a proven track record of building or scaling small teams (2-5 people) in security, compliance, automation, or operations functions.</li>
<li>Are a systems thinker first. You understand how complex environments work: how data flows between systems, where integration points exist, what breaks when systems don&#39;t talk to each other. Your strength is designing the right architecture and environment for security monitoring, not necessarily implementing it yourself.</li>
<li>Have 5+ years of experience designing automated workflows, data pipelines, or system integrations, whether through traditional development, low-code platforms, GRC tools, or process automation. We care about your ability to solve integration problems, not your programming language proficiency.</li>
<li>Can write production-level code in at least one programming language (e.g., Python, Rust, Go).</li>
<li>Have a relentless focus on data integration: you understand how to pull data from multiple sources, normalise it, join it meaningfully, and surface insights. You&#39;re comfortable reasoning about messy, inconsistent data and designing systems that handle edge cases gracefully.</li>
<li>Understand APIs and integration patterns conceptually: REST APIs, webhooks, authentication flows, polling vs. push architectures, and can evaluate systems based on how well they expose data and support automation, even if you&#39;re not writing the integration code yourself.</li>
<li>Can work independently with minimal guidance, taking ownership of complex problems from design through implementation while managing ambiguity inherent in early-stage programs.</li>
<li>Have strong analytical and problem-solving skills, with the ability to break down complex problems into manageable parts and develop creative solutions.</li>
<li>Are able to communicate complex technical ideas to both technical and non-technical stakeholders, with a strong focus on collaboration and teamwork.</li>
<li>Are passionate about staying up-to-date with industry trends and emerging technologies, with a willingness to learn and adapt to new tools and techniques.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>GRC, GRC Automation, Data Pipelines, System Integrations, Data Integration, Data Normalisation, APIs, REST APIs, Webhooks, Authentication Flows, Polling vs. Push Architectures, Policy-as-Code, Automated Checks, Continuous Validation, Continuous Monitoring, Cloud Infrastructure, Identity Providers, HR Platforms, Ticketing Tools, Version Control, CI/CD Pipelines, GRC Platforms, Compliance Automation, Risk Management, Audit Programs, SOC 2, ISO, HIPAA, FedRAMP, Security, Compliance, Python, Rust, Go</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic is a technology company that aims to create reliable, interpretable, and steerable AI systems. It has a quickly growing team of researchers, engineers, policy experts, and business leaders.</Employerdescription>
      <Employerwebsite>https://job-boards.greenhouse.io</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/4980335008</Applyto>
      <Location>San Francisco, CA | New York City, NY | Seattle, WA</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>aa015612-5ff</externalid>
      <Title>Product &amp; Solutions Lead, Safety and Security</Title>
      <Description><![CDATA[<p><strong>Job Posting</strong></p>
<p><strong>Product &amp; Solutions Lead, Safety and Security</strong></p>
<p><strong>Location</strong></p>
<p>San Francisco</p>
<p><strong>Employment Type</strong></p>
<p>Full time</p>
<p><strong>Department</strong></p>
<p>Intelligence &amp; Investigations</p>
<p><strong>Compensation</strong></p>
<ul>
<li>$288K – $425K • Offers Equity</li>
</ul>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p>More details about our benefits are available to candidates during the hiring process.</p>
<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components at any time based on individual performance, team or company results, or market conditions.</p>
<p><strong>About the Team</strong></p>
<p>The Intelligence &amp; Investigations (I2) team detects and disrupts abuse and strategic risks so people can use AI safely. We translate real-world signals, investigations, and external threat intelligence into practical mitigations, operating guidance, and partner-ready support that improves safety outcomes across the AI ecosystem.</p>
<p><strong>About the Role</strong></p>
<p>As a Product &amp; Solutions Lead focused on safety and security, you will build and operate 0–1 products, services, and technical solution packages that help developers and public institutions move from experimentation to durable, trusted outcomes—while maintaining public safety, transparency, and respect for privacy and rights.</p>
<p>This role balances two modes of delivery:</p>
<ol>
<li>Bespoke products and technical solutions for strategic internal and external partners, and</li>
<li>Scalable product and solution packages that can be reused broadly across partners and deployments.</li>
</ol>
<p>Training is a component of scale, but not the center of gravity. You will also ship reference implementations, playbooks, evaluation kits, and repeatable operating models that partners can adopt and operate.</p>
<p>You will work directly with engineers and a multidisciplinary group of safety and geopolitical analysts, and data and quantitative scientists to convert complex, evolving challenges into solutions that teams can adopt in high-stakes environments.</p>
<p>This role is based in San Francisco, CA (hybrid, 3 days/week). Relocation support is available.</p>
<p><strong>In this role, you will:</strong></p>
<ul>
<li>Own the 0–1 roadmap for safety and security solution offerings: define the target users, problem statements, tools, operating models, success metrics, and the set of reusable deliverables we ship.</li>
<li>Design and ship bespoke technical solutions for priority partners (internal and external), then abstract what works into reusable patterns and toolkits.</li>
<li>Build partner-ready technical artifacts: solution blueprints, reference architectures, evaluation and monitoring guidance, incident/response playbooks, and deployment checklists.</li>
<li>Package open-source and proprietary capabilities into adoption-ready solutions (e.g., reference implementations, configuration patterns, validated workflows).</li>
<li>Maintain a consistent delivery model across engagements: intake, scoping, governance alignment, execution cadence, and retrospectives that improve the offering over time.</li>
<li>Translate evolving threats into actionable guidance and updates for solution packages (e.g., scams/fraud patterns, cyber-enabled threats, ecosystem abuse trends).</li>
<li>Develop lightweight enablement components as needed: targeted technical modules, hands-on labs, and readiness assessments that accelerate adoption of the solutions.</li>
<li>Define and instrument impact measurement: adoption milestones, readiness indicators, reliability and safety posture improvements, and partner satisfaction with outputs.</li>
<li>Partner closely across engineering, safety, geopolitical analysis, and quantitative teams to ensure solutions are technically credible, threat-informed, and measurable.</li>
<li>Communicate crisply and decision-readily to internal and external stakeholders: progress, trade-offs, risks, and recommendations.</li>
</ul>
<p><strong>You might thrive in this role if you:</strong></p>
<ul>
<li>Have 6+ years in product, technical program leadership, solutions, or platform operations, especially in safety, security, risk, integrity, or enterprise/public-sector contexts.</li>
<li>Have built 0–1 solution offerings (product plus services or productized services): taking ambiguous needs, shipping something concrete, then scaling it into a repeatable model.</li>
<li>Have a builder’s mindset: comfortable incubating early-stage ideas, testing them with partners, and evolving them into durable, repeatable safety and security solutions.</li>
<li>Can go deep with engineers and still produce partner-ready artifacts that are clear.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$288K – $425K</Salaryrange>
      <Skills>product leadership, technical program leadership, solutions, platform operations, safety, security, risk, integrity, enterprise/public-sector contexts, product development, solution development, technical writing, communication, project management, team leadership, collaboration, problem-solving, analytical skills, data analysis, threat intelligence, incident response, compliance, cybersecurity, risk management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is a technology company that focuses on developing and applying artificial intelligence in a way that benefits humanity. It was founded in 2015 and has since grown to become one of the leading AI research and development companies in the world.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/c664cc09-d996-450c-8683-ad591ac27c11</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>e2b12c01-dfb</externalid>
      <Title>Founding Technical Pre-Sales Engineer</Title>
      <Description><![CDATA[<p><strong>Build AI automations that America&#39;s largest real estate firms actually use to make billion-dollar decisions.</strong></p>
<p><strong><strong>About Us:</strong></strong></p>
<p>Fifth Dimension is building <em>the</em> all-in-one platform for commercial real estate, vastly outperforming horizontal AI players on complex tasks requiring deep enterprise and institutional knowledge. We have an established international customer base of enterprise CRE companies, with strong traction across the US and Europe as well as some customers in APAC.</p>
<p>We’re a hypergrowth early-stage startup with ample runway, backed by Tier 1 European and American investors. In the last year we’ve seen 7x ARR growth and 13x’d product usage. We currently have offices in London and New York, and a Singapore office opening in Q1 2026, of which you’ll be a founding member.</p>
<p>The founders, Johnny Morris and Dr. Kate Jarvis, are both data nerds. But that&#39;s not all! Johnny&#39;s got 16 years&#39; experience applying data and analytics to Real Estate, our GTM niche, and Kate&#39;s got a PhD in Linguistics from Stanford and 12 years&#39; experience as a CPTO across Silicon Valley and London startups. We&#39;re demanding, excitable, encouraging of experimentation, adoring of bad puns, full of smart and autonomous folks, and have built our business on a foundation of compassion and kind challenge.</p>
<p>Sound like your cup of tea (or coffee)?</p>
<p><strong><strong>Role Overview:</strong></strong></p>
<p>As one of 2 hires for the new Singapore HQ, you have the opportunity to shape an entirely new geo with the biggest players in Commercial Real Estate. You&#39;ll be the critical bridge between our innovative AI solutions and prospective customers&#39; business challenges, as well as developing the insights to help us tailor our product to the region.</p>
<p>Day-to-day you&#39;ll work alongside the incoming Head of Sales and EU staff seconded to Singapore to win complex deals, as well as collaborating with UK &amp; US colleagues through largely async tools and processes. You&#39;ll operate with real autonomy: we need someone who can handle high-stakes technical conversations, develop POCs, deliver demos, and make decisions without waiting for the rest of the company to wake up.</p>
<p>This is a high-ownership role for someone who&#39;s energised by building from the ground up. You’ll be pivotal in expanding our APAC customer base - shaping how we sell, demo, and position in this region. And as we scale, you&#39;ll have the opportunity to build out the technical sales function you&#39;re establishing. This is your chance to be foundational to a region for a company with real traction and the ambition to become the most valuable platform in commercial real estate.</p>
<p><strong><strong>Key Responsibilities:</strong></strong></p>
<ul>
<li><strong>Technical Discovery &amp; Evaluation:</strong> Run in-depth technical discovery meetings with senior customer stakeholders (up to Executive level) to understand prospect needs, pain points, and technical requirements. Evaluate the ROI of building new data pipelines.</li>
<li><strong>Solution Design:</strong> Architect and present custom solutions that align our AI capabilities with client business objectives. Run API mapping and hand over clean specs to engineering post-sale.</li>
<li><strong>Demo Excellence:</strong> Create and deliver compelling product demonstrations and Proof of Concepts that showcase the value of our platform in relevant, customer-specific scenarios, using our platform and tools such as Claude Code or OpenAI Codex.</li>
<li><strong>Technical Objection Handling:</strong> Address and overcome technical objections with clear, concise explanations that build prospect confidence.</li>
<li><strong>Cross-Functional Collaboration:</strong> Work closely with Sales, Solutions, and Engineering teams to share customer insights, influence product direction, and ensure implementation success.</li>
<li><strong>APAC Product Voice:</strong> Be opinionated on how our product needs customisation for APAC markets, communicate clearly to Engineering, influence the roadmap to ensure we&#39;re building what the region actually needs.</li>
</ul>
<p>You&#39;ll travel ~25% of the time within APAC for customer meetings and conferences, plus quarterly trips to the UK HQ/other international location for all-company gatherings.</p>
<p><strong><strong>What We&#39;re Looking For:</strong></strong></p>
<p><strong>You&#39;ve thrived in an early-stage high-growth VC-backed startup, or in a similar role at a scaleup (think: being the first to sell and build a team around a shiny new product or geo). This isn&#39;t just preferred - it&#39;s required.</strong></p>
<p><strong>Must</strong></p>
<ul>
<li><strong>Pre-Sales Experience:</strong> 8+ years in Enterprise B2B SaaS pre-sales engineering, solutions consulting, or technical sales, for a technical product</li>
<li><strong>Industry Knowledge:</strong> Worked in Real Estate / Proptech</li>
<li><strong>Technical Acumen:</strong> Understanding of APIs, data integration, AI/ML concepts, and system architecture</li>
<li><strong>Product-minded:</strong> You can evaluate build vs. buy decisions, push back on requirements that aren&#39;t feasible (to the point of killing a deal), and articulate what the market actually needs versus what customers say they want. You&#39;ve worked closely with engineering teams before - not just handing off requests, but shaping what gets built.</li>
<li><strong>AI-native:</strong> You can’t go a day without using Claude Code, OpenAI Codex, or similar tools; you can talk for hours about how you’ve integrated them into your day-to-day, and what you’ve built.</li>
<li><strong>Great Communicator:</strong> Exceptional presentation abilities, with experience leading technical discussions at various levels of an organization. Excellent async communicator - the majority of our company is in a timezone 8+ hours behind Singapore.</li>
<li><strong>Excellent Relationship Builder:</strong> You establish trust through technical credibility and genuine interest in customer success.</li>
<li><strong>Work Authorization:</strong> Must be a Singapore Citizen, Permanent Resident, or hold a valid work pass</li>
<li><strong>Hybrid Onsite Availability:</strong> Must be open to a hybrid work schedule (2+ days per week in office)</li>
</ul>
<p><strong>Nice to have</strong></p>
<ul>
<li>Software engineering experience</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>Competitive salary and equity</Salaryrange>
      <Skills>Pre-Sales Experience, Industry Knowledge: Real Estate / Proptech, Technical Acumen: APIs, data integration, AI/ML concepts, system architecture, Product-minded: build vs. buy decisions, push back on requirements, articulate market needs, AI-native: Claude Code / Open AI codex, Great Communicator: presentation abilities, technical discussions, async communicator, Excellent Relationship Builder: trust through technical credibility, customer success, Software engineering experience</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Fifth Dimension</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.fifthdimensionai.com.png</Employerlogo>
      <Employerdescription>Fifth Dimension is building an all-in-one platform for commercial real estate, vastly outperforming horizontal AI players on complex tasks requiring deep enterprise and institutional knowledge.</Employerdescription>
      <Employerwebsite>https://careers.fifthdimensionai.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.fifthdimensionai.com/jobs/7030405-founding-technical-pre-sales-engineer</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>1106927f-e3c</externalid>
      <Title>Data Engineer III</Title>
      <Description><![CDATA[<p>As a Data Engineer III at EADP Gameplay Services, you will plan, build, and deploy enterprise integration and business intelligence solutions to support and enhance matchmaking services. You will serve as a subject matter expert in developing data integration solutions with modern ETL technologies, leveraging advanced tools to store and analyse large-scale data and address complex business challenges.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Analyse and organise raw matchmaking data; model datasets (3NF, dimensional) for analytics and products.</li>
<li>Build and operate reliable data pipelines (batch and streaming) across multiple sources at petabyte scale.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>6-8 years designing, developing, and managing data systems (warehouses, lakes, and distributed stores) at multi-PB scale.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data integration, ETL technologies, data analysis, real-time pipelines, data quality frameworks, basic statistical modelling</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Data-Engineer-III/212231</Applyto>
      <Location>Hyderabad</Location>
      <Country></Country>
      <Postedate>2026-02-17</Postedate>
    </job>
    <job>
      <externalid>9db29330-a94</externalid>
      <Title>Data Engineer III</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Data Engineer III to join our team. As a Data Engineer III, you will be responsible for designing, building, and optimizing scalable data acquisition, transformation, and integration pipelines to meet evolving business requirements.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Design, build, and optimize scalable data acquisition, transformation, and integration pipelines to meet evolving business requirements.</li>
<li>Develop, maintain, and automate ETL/ELT processes to extract, transform, and load data from diverse sources into analytical and operational systems with high accuracy and timeliness.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>7+ years of experience in data integration, data engineering, or related roles, with a proven track record of delivering scalable data solutions.</li>
<li>Advanced proficiency in Snowflake, including its advanced features and utilities.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data integration, data engineering, Snowflake, ETL/ELT processes, data modeling</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts Inc.</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Data-Engineer-III/211255</Applyto>
      <Location>Hyderabad</Location>
      <Country></Country>
      <Postedate>2026-02-17</Postedate>
    </job>
    <job>
      <externalid>0d8ffba2-706</externalid>
      <Title>Internship: Digital Transformation &amp; Financial Systems Management</Title>
      <Description><![CDATA[<p><strong>What you&#39;ll do</strong></p>
<p>You&#39;ll be part of the team that actively shapes the future of financial processes at Porsche. Whether it&#39;s networked automation or intelligent data integration, we&#39;re rethinking finance.</p>
<ul>
<li>You&#39;ll focus on (agile) projects driving the digital transformation of the finance department</li>
<li>You&#39;ll be responsible for the maintenance and optimization of financial systems and processes at Porsche</li>
<li>You&#39;ll make sure new digital services and products at Porsche are correctly reflected in the balance sheet and in our processes</li>
<li>You&#39;ll create process documents and compliance guidelines</li>
<li>You&#39;ll prepare analyses and presentations and brief management</li>
<li>You&#39;ll handle administrative tasks and day-to-day business</li>
<li>You&#39;ll perform business and financial analyses</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>You are enrolled in business administration, engineering, computer science, or a comparable degree program (at least in your 3rd semester, or in the gap year between Bachelor&#39;s and Master&#39;s)</li>
<li>Ideally, you have some practical experience</li>
<li>You are confident working with MS Office</li>
<li>You&#39;re not afraid of IT systems</li>
<li>You have strong analytical and creative skills, as well as teamwork and communication skills</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>digital transformation, financial systems, process optimization, balance sheet management, compliance guidelines, MS Office, IT systems, Agile project management, financial analysis, data integration</Skills>
      <Category>Finance</Category>
      <Industry>Automotive</Industry>
      <Employername>Dr. Ing. h.c. F. Porsche AG</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.porsche.com.png</Employerlogo>
      <Employerdescription>Porsche is a valuable brand with worldwide appeal and a loyal customer base around the globe. The way we work together and hold together as a team is unique. Our Miteinander is shaped by our strong Porsche culture: Herzblut | Sportlichkeit | Pioniergeist | Eine Familie</Employerdescription>
      <Employerwebsite>https://jobs.porsche.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.porsche.com/index.php?ac=jobad&amp;id=17975</Applyto>
      <Location>Stuttgart-Zuffenhausen</Location>
      <Country></Country>
      <Postedate>2025-12-08</Postedate>
    </job>
  </jobs>
</source>