{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/data-visualisation"},"x-facet":{"type":"skill","slug":"data-visualisation","display":"Data Visualisation","count":39},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_fb54f45e-94a"},"title":"Praktikum im Bereich Arbeitssicherheit mit Schwerpunkt Digitalisierung","description":"<p>As a member of our team, you will play a key role in supporting the implementation of digitalisation projects, focusing on the Microsoft Power Platform, Microsoft Fabric, and potentially Celonis and KNIME. You will also contribute to the development of AI agents using Microsoft Copilot. Your tasks will include creating and preparing analyses in the form of digital dashboards and reports, as well as participating in the creation of presentation, documentation, and training materials.</p>\n<p>The internship will start in June and last for 5-6 months. You will have high levels of autonomy and flexibility in your work, with opportunities to gain insights into various areas of the company. Our active community of interns will provide you with support and guidance throughout your time with us.</p>\n<p>To succeed in this role, you will need to have excellent analytical and problem-solving skills, as well as strong communication and teamwork abilities. 
You should be proficient in using Microsoft Office applications, particularly Excel, and have experience with data analysis and visualisation tools. Familiarity with programming languages such as Python or R would be an advantage.</p>\n<p>If you are looking for a challenging and rewarding internship opportunity that will help you develop your skills and knowledge in the field of digitalisation, we encourage you to apply.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_fb54f45e-94a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Dr. Ing. h.c. F. Porsche AG","sameAs":"https://jobs.porsche.com","logo":"https://logos.yubhub.co/jobs.porsche.com.png"},"x-apply-url":"https://jobs.porsche.com/index.php?ac=jobad&id=20388","x-work-arrangement":"onsite","x-experience-level":"junior","x-job-type":"internship","x-salary-range":null,"x-skills-required":["Microsoft Power Platform","Microsoft Fabric","Celonis","KNIME","Microsoft Copilot","data analysis","data visualisation","Excel","Python","R"],"x-skills-preferred":[],"datePosted":"2026-04-22T17:31:57.403Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Zuffenhausen"}},"employmentType":"INTERN","occupationalCategory":"Engineering","industry":"Automotive","skills":"Microsoft Power Platform, Microsoft Fabric, Celonis, KNIME, Microsoft Copilot, data analysis, data visualisation, Excel, Python, R"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_dd821a22-5a6"},"title":"Abschlussarbeit (J000020339)","description":"<p>The increased use of lithium-ion batteries in future electric vehicles presents new challenges to systems security, particularly in terms of high energy densities and increased performance demands. 
A particularly safety-relevant scenario is thermal runaway of individual battery cells. In this process, a highly energetic gas-particle stream is released, characterised by strong transient thermal, mechanical, and abrasive loads. These stresses can cause significant damage to adjacent components and lead to structural failure.</p>\n<p>The goal of this work is to develop a predictive method for evaluating the performance of materials used in venting structures under thermal runaway conditions, where they are exposed to escaping gas and particles. To achieve this, a neural network will be designed, trained, and applied to new materials. The neural network will be trained and validated using existing experimental and simulation data. The generated evaluation data will be compared with results from classical substitute and cell tests to evaluate the performance and reliability of the developed approach. Based on the obtained data, characteristic parameters for evaluating material failure under thermal runaway conditions will be identified and derived.</p>\n<p>In the first step, a systematic literature review will be conducted on existing experimental, analytical, and simulation methods for evaluating fire protection materials in the thermal runaway context. Based on this, a suitable model for evaluating material performance in the context of thermal runaway will be developed, trained, and implemented. The model will be validated using experimental data to assess its predictive accuracy and robustness. 
Finally, the applicability of the developed approach and potential opportunities for further development will be critically discussed.</p>\n<p>Key tasks:</p>\n<ul>\n<li>Conduct a systematic literature review on existing experimental, analytical, and simulation methods for evaluating fire protection materials in the thermal runaway context.</li>\n<li>Design, train, and implement a neural network for predictive evaluation of material performance under thermal runaway conditions.</li>\n<li>Evaluate and analyse experimental and simulation data to validate the model.</li>\n<li>Compare generated evaluation data with results from classical substitute and cell tests.</li>\n<li>Identify and derive characteristic parameters for evaluating material failure under thermal runaway conditions.</li>\n<li>Critically discuss the applicability of the developed approach and potential opportunities for further development.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s or master&#39;s degree in computer science, mechanical engineering, electrical engineering, or a related field.</li>\n<li>Experience in machine learning and deep learning.</li>\n<li>Familiarity with Python programming language and relevant libraries.</li>\n<li>Good understanding of thermal runaway phenomena and fire protection materials.</li>\n<li>Excellent communication and teamwork skills.</li>\n<li>Ability to work independently and manage multiple tasks.</li>\n</ul>\n<p>Preferred skills:</p>\n<ul>\n<li>Experience with neural networks and deep learning frameworks such as TensorFlow or PyTorch.</li>\n<li>Familiarity with simulation software such as ANSYS or COMSOL.</li>\n<li>Knowledge of thermal analysis and heat transfer.</li>\n<li>Experience with data analysis and visualisation tools such as Matplotlib or 
Seaborn.</li>\n</ul>\n<ul>\n<li>Familiarity with version control systems such as Git.</li>\n</ul>","url":"https://yubhub.co/jobs/job_dd821a22-5a6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Dr. Ing. h.c. F. Porsche AG","sameAs":"https://jobs.porsche.com","logo":"https://logos.yubhub.co/jobs.porsche.com.png"},"x-apply-url":"https://jobs.porsche.com/index.php?ac=jobad&id=20339","x-work-arrangement":"onsite","x-experience-level":"entry","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","Machine learning","Deep learning","Neural networks","Data analysis","Data visualisation","Simulation software","Thermal analysis","Heat transfer"],"x-skills-preferred":["TensorFlow","PyTorch","ANSYS","COMSOL","Matplotlib","Seaborn","Git"],"datePosted":"2026-04-22T17:28:20.238Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Weissach"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Automotive","skills":"Python, Machine learning, Deep learning, Neural networks, Data analysis, Data visualisation, Simulation software, Thermal analysis, Heat transfer, TensorFlow, PyTorch, ANSYS, COMSOL, Matplotlib, Seaborn, Git"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6690b2fa-cab"},"title":"(Senior) Team Lead Data Analytics (all genders)","description":"<p>At Holidu, data isn&#39;t just a support function, it&#39;s how we make decisions. 
The Analytics team builds the products and foundations that keep the whole organisation sharp, from day-to-day operations to long-term strategy.</p>\n<p>This role is based in Munich, with two office days per week.</p>\n<p>As a Senior Team Lead Data Analytics, you will lead one of Holidu&#39;s core analytics teams, a function at the intersection of data, strategy, and real business impact. The team has four direct reports, and the role entails collaborating cross-functionally with data engineers and data scientists.</p>\n<p>Engage with senior leadership on strategic projects, providing insights that influence product strategy, internal operations, and revenue growth.</p>\n<p>You and your team will support a range of stakeholders across the company (e.g. Customer Support, Host Experience, Sales and Account Management).</p>\n<p>As a member of the BI leadership team, you will help shape the department strategy and the future of AI-powered data products.</p>\n<p>Understand problems and identify opportunities across a diverse range of stakeholder use cases, translating them into analytical requirements and communicating complex findings clearly to both technical and commercial audiences.</p>\n<p>Lead from the front: this role carries meaningful individual contributor responsibility. You&#39;ll be expected to do real analytical work, diving deep into the data, building solutions, and setting the bar for quality in your team.</p>\n<p>Shape the future of analytics at Holidu by recruiting top talent, setting clear goals, and developing your team personally and professionally.</p>\n<p>The ideal candidate will have 5+ years of data analytics experience, people management experience, a collaborative mindset, a mission-driven mentality, excellent analytical and technical skills, and a genuine commitment to AI enablement.</p>\n<p>Impact: Shape the future of travel with products used by millions of guests and thousands of hosts. 
At Holidu ideas become products, data drives decisions, and iteration fuels fast learning. Your work matters - and you’ll see the impact.</p>\n<p>Learning: Grow professionally in a culture that thrives on curiosity and feedback. You’ll learn from outstanding colleagues, collaborate across disciplines, and benefit from mentorship, and personal learning budgets - with a strong focus on AI.</p>\n<p>Great People: Join a team of smart, motivated and international colleagues who challenge and support each other. We celebrate wins and keep our culture fun, ambitious and human. Our customers are guests and hosts - people we can all relate to - making work meaningful and energizing.</p>\n<p>Technology: Work in a modern tech environment. You’ll experience the pace of a scale-up combined with the stability of a proven business model, enabling you to build, test, and improve continuously.</p>\n<p>Flexibility: Work a hybrid setup with 50% in-office time for collaboration, and spend up to 8 weeks a year from other inspiring locations. 
You’ll stay connected through regular events and meet-ups across our almost 30 offices.</p>\n<p>Perks on Top: Of course, we also offer travel benefits, gym discounts, and other perks to keep you energized - but what truly sets us apart is the chance to grow in a dynamic industry, alongside amazing people, while having fun along the way.</p>","url":"https://yubhub.co/jobs/job_6690b2fa-cab","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Holidu Hosts GmbH","sameAs":"https://holidu.jobs.personio.com","logo":"https://logos.yubhub.co/holidu.jobs.personio.com.png"},"x-apply-url":"https://holidu.jobs.personio.com/job/2598226","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"Full-time","x-salary-range":null,"x-skills-required":["Database: AWS Stack (Redshift, Athena, Glue, S3)","Data Pipelines: Airflow, dbt","Data Visualisation: Looker","Data Analytics: SQL, Python","Collaboration: Git, Jira, Confluence, Slack"],"x-skills-preferred":[],"datePosted":"2026-04-18T22:13:28.264Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Munich, Germany"}},"employmentType":"FULL_TIME","occupationalCategory":"Technology","industry":"Travel Technology","skills":"Database: AWS Stack (Redshift, Athena, Glue, S3), Data Pipelines: Airflow, dbt, Data Visualisation: Looker, Data Analytics: SQL, Python, Collaboration: Git, Jira, Confluence, Slack"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_50fb3460-886"},"title":"Research Operations, Economic Research","description":"<p>We are seeking a strong operator to serve as the operational backbone of our Economic Research team. 
Your overarching mandate is twofold: 1) clear the brush so our researchers can efficiently execute on key research priorities, and 2) ensure research aspirations and findings translate into concrete policy and commercial outcomes.</p>\n<p>In this role, you will drive execution on the team&#39;s research initiatives, such as economic index releases, economic briefs and working papers, agents research, Clio observability analyses, and Anthropic interviewer deployments. You will also work backward from business deadlines to set research deadlines/milestones and plan out team capacity.</p>\n<p>Key responsibilities include ensuring smooth operating cadence and relationship touchpoints with our research ecosystem, partnering with colleagues on Research, Public Policy, Communications, and GTM to organise briefings and campaigns that amplify key research findings, and supporting the team lead in setting quarterly research priorities and resource allocation.</p>\n<p>We are looking for someone with a 5+ year track record managing and delivering on business-critical, cross-functional initiatives, who can quickly gain state on technical domains and navigate organisational complexity, building relationships and influencing without authority.</p>\n<p>Sample projects include building a living tracker for critical research workstreams, managing contracting and vendor relationships for commissioned research projects, and coordinating cross-functional alignment for a major academic research partnership.</p>\n<p>The annual compensation range for this role is $180,000-$210,000 USD.</p>",
"url":"https://yubhub.co/jobs/job_50fb3460-886","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.com/","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5154112008","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$180,000-$210,000 USD","x-skills-required":["project management","research operations","cross-functional collaboration","data analysis","communication skills"],"x-skills-preferred":["AI research","economic research","policy analysis","data visualisation","public speaking"],"datePosted":"2026-04-18T16:00:30.078Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Research","industry":"Technology","skills":"project management, research operations, cross-functional collaboration, data analysis, communication skills, AI research, economic research, policy analysis, data visualisation, public speaking","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":210000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4ea1ac8b-8e1"},"title":"Digital Success Intern (Summer 2026)","description":"<p>Secure Every Identity, from AI to Human. Identity is the key to unlocking the potential of AI. Okta secures AI by building the trusted, neutral infrastructure that enables organisations to safely embrace this new era.</p>\n<p>This work requires a relentless drive to solve complex challenges with real-world stakes. We are looking for builders and owners who operate with speed and urgency and execute with excellence. This is an opportunity to do career-defining work. 
We&#39;re all in on this mission. If you are too, let&#39;s talk.</p>\n<p>About the Internship: The Digital Success team helps Okta customers of all sizes adopt our platform and maximise their investment. We bridge the gap between customer pain points and tangible value, guiding companies from onboarding to a successful renewal. As a Digital Success Intern focusing on Automated Field Insights, you will play a pivotal role in our ability to scale value realisation by leveraging cutting-edge automation and AI to deliver insights at scale.</p>\n<p>Throughout your internship, you will contribute to high-impact projects leveraged by our customer-facing teams, directly influencing how our customers perceive value.</p>\n<p>What You’ll Get to Do:</p>\n<ul>\n<li>Content Audit &amp; Optimisation: Lead the end-to-end audit of our automated presentation decks. You’ll ensure content is current, on-brand, and consistent across our decks and Highspot, collaborating cross-functionally to maintain a &#39;single source of truth&#39; while identifying ways to better use AI to maintain content over time.</li>\n<li>Customer Sentiment &amp; Gong Analysis: Dive deep into Gong data to analyse customer calls. You will identify emerging business goals (specifically around AI adoption), uncover recurring themes, and compare these insights against our current Matik content to recommend high-impact updates.</li>\n<li>AI Functionality Deep-Dive: Act as the team’s &#39;AI Scout&#39; by exploring Platform AI capabilities. 
You will evaluate and build internal use cases for AI Insights, AI Query Builder, and email to streamline our digital success motion.</li>\n<li>Strategic Execution: Identify iterative improvements for existing programs based on product usage data and feedback from the broader Customer Success organisation.</li>\n<li>Stakeholder Presentation: Conclude your internship by presenting your findings, AI recommendations, and project impact to Okta leadership and key stakeholders.</li>\n</ul>\n<p>Who We Are Looking For:</p>\n<ul>\n<li>Education: Rising senior pursuing a Bachelor’s degree or higher (Business, Marketing, Data Science, or Communications preferred).</li>\n<li>Graduating Dec 2026 or Spring 2027.</li>\n<li>Analytical Mindset: You enjoy &#39;connecting the dots&#39; between raw data and customer stories, and have a high level of attention to detail.</li>\n<li>Technical Curiosity: A genuine interest in Identity Security and the intersection of AI and Customer Success.</li>\n<li>Collaboration: Proven ability to work across teams like Marketing, Product, and Operations.</li>\n<li>Communication: Strong copywriting skills with an eye for design consistency and brand voice.</li>\n<li>Bonus Points: Familiarity with Data Visualisation and Customer Success Software (e.g. Matik, Gong, Gainsight, Pendo, or Tableau).</li>\n</ul>\n<p>Okta’s Intern Program</p>\n<p>As an intern, you’ll do real work that matters. While you’re on board, you’ll work on meaningful projects and have an opportunity to see what working at Okta is all about. You’ll also have the support of your mentor and manager to help you develop new skills.</p>\n<p>Our interns have the opportunity to build a strong community - with their fellow interns, within their teams, and with the broader company. 
We want you to grow professionally and you’ll do that through participating in events like our Executive Speaker Series and Brown Bags.</p>\n<p>And of course, we want you to have fun, too. Our internship program includes exciting opportunities to connect with your cohort beyond the office through fun and classic local outings.</p>","url":"https://yubhub.co/jobs/job_4ea1ac8b-8e1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7791786","x-work-arrangement":"hybrid","x-experience-level":"internship","x-job-type":"internship","x-salary-range":null,"x-skills-required":["Content Audit & Optimisation","Customer Sentiment & Gong Analysis","AI Functionality Deep-Dive","Strategic Execution","Stakeholder Presentation"],"x-skills-preferred":["Data Visualisation","Customer Success Software","Matik","Gong","Gainsight","Pendo","Tableau"],"datePosted":"2026-04-18T15:57:39.949Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bellevue, Washington"}},"employmentType":"INTERN","occupationalCategory":"Engineering","industry":"Technology","skills":"Content Audit & Optimisation, Customer Sentiment & Gong Analysis, AI Functionality Deep-Dive, Strategic Execution, Stakeholder Presentation, Data Visualisation, Customer Success Software, Matik, Gong, Gainsight, Pendo, Tableau"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_77050838-92f"},"title":"Senior Director, Data Analytics","description":"<p>As the Senior Director, Data Analytics, you&#39;ll be the strategic analytics leader for Marketing and Product. 
You&#39;ll lead a newly combined organisation that brings together data-driven insights across the customer lifecycle, from acquisition through adoption and expansion.</p>\n<p>Reporting to the Vice President, Enterprise Data, you&#39;ll partner closely with senior leaders to improve how teams make decisions, measure performance, and drive outcomes, with a focus on shared views of usage and consumption models and major product launches.</p>\n<p>You&#39;ll oversee two critical functions: Marketing Analytics (including demand generation, lifecycle marketing, brand, web, developer relations, localization, monetization, and campaign and event effectiveness) and Product Data Insights (including DevOps, Security, Platforms, AI products, new usage and consumption models, product adoption, feature usage, and customer behaviour analysis).</p>\n<p>Key responsibilities include:</p>\n<p>Defining and executing a unified analytics strategy across Marketing and Product, including shared metrics, measurement frameworks, and dashboards that serve as a single source of truth and connect marketing investment to product adoption, usage, and customer outcomes.</p>\n<p>Building clear operating rhythms and ways of working that close historical gaps and help Marketing and Product make consistent, data-informed decisions.</p>\n<p>Partnering with Marketing and Product leadership, including the Chief Product &amp; Marketing Officer and other senior stakeholders, to provide actionable insights and executive-ready recommendations that shape go-to-market plans, product launches, roadmap prioritisation, and user experience improvements.</p>\n<p>Solving complex analytics problems across both functions, including attribution modelling, lead scoring optimisation, campaign and event effectiveness, product adoption and feature usage analysis, and analytics for AI-powered features (including instrumentation, usage, and cost drivers).</p>\n<p>Building and maintaining forecasting and scenario modelling 
frameworks in partnership with Finance, Product, and go-to-market leaders, tying pipeline, recurring revenue, and usage or consumption models to planning and investment decisions.</p>\n<p>Establishing and scaling an experimentation programme across Marketing and Product, setting standards for hypothesis design, test methodology, instrumentation requirements, and clear readouts that translate results into decisions.</p>\n<p>Building strong partnerships with data engineering, engineering, and legal, privacy, and security teams to translate business questions into technical requirements, prioritise telemetry and data model work, and improve reliability, quality, accessibility, and compliance in the analytics stack.</p>\n<p>Hiring, mentoring, and developing leaders and team members, raising the bar for strategic thinking, stakeholder partnership, and end-to-end ownership across the analytics organisation.</p>\n<p>Requirements include:</p>\n<p>Strategic analytics leadership across both marketing analytics and product analytics, ideally in B2B SaaS or other high-growth technology environments.</p>\n<p>Experience building, leading, and developing multi-layer analytics teams (including hiring, managing managers, and coaching leaders).</p>\n<p>Ability to define and operationalise end-to-end measurement frameworks across Marketing and Product, including shared KPIs and clear metric definitions.</p>\n<p>Strong analytical and technical skills, including SQL; statistical analysis and experimentation (A/B and multivariate testing); forecasting and scenario modelling; and advanced analytics techniques.</p>\n<p>Experience with data visualisation and BI tools (Tableau or similar), with a track record of building executive-ready reporting and narratives for senior leaders.</p>\n<p>Proven partnership with Engineering and Data Engineering to translate business needs into telemetry, data models, and analytics requirements, and to improve reliability and delivery across the analytics 
stack.</p>\n<p>Experience collaborating with Legal, Privacy, and Security partners to design compliant telemetry and data-collection approaches that respect regulations and customer expectations.</p>\n<p>Ability to influence senior leaders through clear communication and actionable recommendations, and to work effectively in a fully remote, asynchronous environment while adopting GitLab&#39;s values and ways of working.</p>\n<p>The base salary range for this role&#39;s listed level is currently $184,400-$314,400 USD for residents of the United States only.</p>","url":"https://yubhub.co/jobs/job_77050838-92f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"GitLab","sameAs":"https://about.gitlab.com/","logo":"https://logos.yubhub.co/about.gitlab.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/gitlab/jobs/8436589002","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$184,400-$314,400 USD","x-skills-required":["strategic analytics leadership","marketing analytics","product analytics","SQL","statistical analysis","experimentation","forecasting","scenario modelling","data visualisation","BI tools","Tableau","executive-ready reporting","narratives","engineering","data engineering","legal","privacy","security"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:56:55.180Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"strategic analytics leadership, marketing analytics, product analytics, SQL, statistical analysis, experimentation, forecasting, scenario modelling, data visualisation, BI tools, Tableau, executive-ready reporting, narratives, engineering, data engineering, 
legal, privacy, security","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":184400,"maxValue":314400,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7b76ee44-a05"},"title":"Revenue Operations Manager (Post Sales)","description":"<p>About Dialpad</p>\n<p>Dialpad is the AI-native business communications platform. We unify calling, messaging, meetings, and contact center on a single platform - powered by AI that understands every conversation in real time.</p>\n<p>More than 70,000 companies around the globe, including WeWork, Asana, NASDAQ, AAA Insurance, COMPASS Realty, Uber, Randstad, and Tractor Supply, rely on Dialpad to build stronger customer connections using real-time, AI-driven insights.</p>\n<p>We’re now leading the shift to Agentic AI: intelligent agents that don’t just analyse conversations but take action by automating workflows, resolving customer issues, and accelerating revenue in real time. Our DAART initiative (Dialpad Agentic AI in Real Time) is redefining what a communications platform can do.</p>\n<p>Visit dialpad.com to learn more.</p>\n<p>Being a Dialer</p>\n<p>At Dialpad, AI isn’t just a feature; it’s how our teams do their best work every day. We put powerful AI tools in every employee’s hands so they can move faster, think bigger, and achieve more.</p>\n<p>We believe every conversation matters. And we’ve built the platform that turns those conversations into insight and action, for our customers and ourselves.</p>\n<p>We look for people who are intensely curious and hold themselves to a high bar. 
Our ambition is significant, and achieving it requires a team that operates at the highest level.</p>\n<p>We seek individuals who embody our core traits: Scrappy, Curious, Optimistic, Persistent, and Empathetic.</p>\n<p>Your role</p>\n<p>The Revenue Operations Manager – Post Sales owns the operational mechanics of Dialpad’s recurring revenue engine. This role is accountable for:</p>\n<ul>\n<li>Gross Revenue Retention (GRR)</li>\n<li>Net Revenue Retention (NRR)</li>\n<li>Renewals forecast accuracy</li>\n<li>Expansion pipeline governance</li>\n<li>Customer Success &amp; Renewals operating cadence</li>\n<li>Land → Expand → Adopt → Renew journey integrity</li>\n<li>Product interlocks &amp; operationalization of new product introductions within the installed base</li>\n</ul>\n<p>This is a revenue ownership role, not a reporting function or a CS business partner role. This position reports to our Director of Business Operations and has the opportunity to be based in our Austin or Tempe offices.</p>\n<p>What you’ll do</p>\n<ul>\n<li>Own operational governance of Gross and Net Revenue Retention.</li>\n<li>Monitor churn, contraction, and expansion drivers.</li>\n<li>Identify structural gaps impacting retention.</li>\n<li>Establish leading indicators for revenue risk.</li>\n<li>Provide executive-level visibility into recurring revenue health.</li>\n<li>Own renewal forecasting methodology and discipline.</li>\n<li>Validate renewal commitments and risk assessments.</li>\n<li>Improve renewal forecast accuracy across segments.</li>\n<li>Establish a structured renewal inspection cadence.</li>\n<li>Design and run operational forums for CS and Renewals.</li>\n<li>Standardize inspection standards across segments.</li>\n<li>Align expansion governance with sales forecasting rigor.</li>\n<li>Ensure consistent pipeline hygiene within post-sales motions.</li>\n<li>Own expansion opportunity visibility and stage discipline.</li>\n<li>Monitor cross-sell/upsell pipeline 
health.</li>\n<li>Identify leakage within installed accounts.</li>\n<li>Align expansion inspection standards with new logo practices.</li>\n<li>Define and govern operational handoffs across lifecycle stages.</li>\n<li>Ensure adoption signals are visible and measurable.</li>\n<li>Identify friction points in the customer journey.</li>\n<li>Partner cross-functionally to improve retention mechanics.</li>\n<li>Serve as RevOps lead for new product rollouts within the installed base.</li>\n<li>Ensure expansion, attach visibility, and adoption tracking.</li>\n<li>Monitor product adoption metrics impacting retention.</li>\n<li>Provide feedback loops to Product on customer behaviour trends.</li>\n</ul>\n<p>Skills you’ll bring</p>\n<ul>\n<li>6–8+ years in Revenue Operations, CS Operations, or Post-Sales Strategy.</li>\n<li>Experience supporting subscription SaaS retention motions.</li>\n<li>Deep understanding of renewal forecasting and expansion mechanics.</li>\n<li>Strong analytical capability and structured thinking.</li>\n<li>Comfortable influencing CS and executive leadership.</li>\n<li>Ability to operate in high-growth, cross-functional environments.</li>\n<li>Experience with Customer Success platforms (Planhat preferred; Gainsight, Totango, or ChurnZero acceptable).</li>\n<li>Strong understanding of renewal forecasting and GRR/NRR modeling.</li>\n<li>Advanced Excel / Sheets modeling skills; SQL proficiency preferred.</li>\n<li>Deep familiarity with Salesforce opportunity and account data structures.</li>\n<li>Experience integrating CS platforms with CRM systems.</li>\n<li>Ability to translate product usage data into retention insights.</li>\n<li>Strong BI and data visualisation experience.</li>\n</ul>\n<p>Why Join Dialpad</p>\n<p>Work at the centre of the AI transformation in business communications</p>\n<p>Build and ship agentic AI products that are redefining how companies operate</p>\n<p>Join a team where AI amplifies every employee’s impact</p>\n<p>Competitive 
salary, comprehensive benefits, and real opportunities for growth</p>\n<p>We believe in investing in our people. Dialpad offers competitive benefits and perks, cutting-edge AI tools, and a robust training program that help you reach your full potential.</p>\n<p>We have designed our offices to be inclusive, offering a vibrant environment to cultivate collaboration and connection.</p>\n<p>Our exceptional culture, repeatedly recognised as a Great Place to Work, ensures that every employee feels valued and empowered to contribute to our collective success.</p>\n<p>Don’t meet every single requirement? If you’re excited about this role and possess the fundamental traits, drive, and strong ambition we seek, but your experience doesn’t meet every qualification, we encourage you to apply.</p>\n<p>Dialpad is an equal-opportunity employer. We are dedicated to creating a community of inclusion and an environment free from discrimination or harassment.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7b76ee44-a05","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Dialpad","sameAs":"https://dialpad.com","logo":"https://logos.yubhub.co/dialpad.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dialpad/jobs/8436715002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Revenue Operations","CS Operations","Post-Sales Strategy","Subscription SaaS retention motions","Renewal forecasting","Expansion mechanics","Analytical capability","Structured thinking","Influencing CS and executive leadership","Customer Success platforms","GRR/NRR modeling","Advanced Excel / Sheets modeling skills","SQL proficiency","Salesforce opportunity and account data structures","Integrating CS platforms with CRM systems","Product usage data into retention insights","BI and data 
visualisation"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:55:04.294Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Austin, US"}},"employmentType":"FULL_TIME","occupationalCategory":"Operations","industry":"Technology","skills":"Revenue Operations, CS Operations, Post-Sales Strategy, Subscription SaaS retention motions, Renewal forecasting, Expansion mechanics, Analytical capability, Structured thinking, Influencing CS and executive leadership, Customer Success platforms, GRR/NRR modeling, Advanced Excel / Sheets modeling skills, SQL proficiency, Salesforce opportunity and account data structures, Integrating CS platforms with CRM systems, Product usage data into retention insights, BI and data visualisation"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e111d755-f4e"},"title":"Senior Solutions Architect - AI/BI","description":"<p>The Solutions Architect (AI/BI) team executes on Databricks&#39; strategic Product Operating Model that provides enhanced focus on earlier stage, highly prioritized product lines in order to establish product market fit, and set the course for rapid revenue growth.</p>\n<p>They are part of a global go-to-market team mandate, though individually will cover a specific, local region. Clients may span across one or more business units and verticals.</p>\n<p>By working in partnership with direct account teams, they will jointly engage clients, foster the necessary relationships, position in-depth the specific product line, so as to provide compelling reasons for clients to adopt and grow the usage of the given product.</p>\n<p>The Solutions Architect (AI/BI) is paired with an Account Executive aligned to a given product line with specific targets accordingly. 
Together, they will devise and implement a strategy across their assigned set of accounts, develop presentations, demos, and other assets and deliver them such that clients make an informed decision as they decide to adopt the product-line in a meaningful way.</p>\n<p>The AI/BI product-line requires the following core technical competencies:</p>\n<ul>\n<li>Experience in designing and delivering cloud-based Data Visualisation and Analytics Solutions in a client or customer environment</li>\n<li>Ability to advise customers in lakehouse analytics architecture: Prepare Databricks stakeholders for internal conversations and communicate directly, including anticipating blockers and addressing them before they become an issue</li>\n<li>Certification and/or demonstrated competence in data visualisation and analytics systems along with one of Azure, AWS, or GCP cloud providers</li>\n<li>Demonstrated competence in the Lakehouse architecture including hands-on experience with Apache Spark, Python, and SQL</li>\n</ul>\n<p>The impact you will have:</p>\n<ul>\n<li>Collaborate with GTM leadership and account teams to design and execute high-impact engagement strategies across your territory.</li>\n<li>As a trusted advisor, serve as an expert Solutions Architect and &quot;champion,&quot; building technical credibility with stakeholders to drive product adoption and vision.</li>\n<li>Enable clients at scale through workshops and developing customer-facing collateral that helps increase technical knowledge and thought leadership.</li>\n<li>Influence product roadmap by translating field-derived, data-driven insights into strategic recommendations for Product and Engineering teams</li>\n<li>Handle the most complex technical challenges in this product line by acting as the tier-3 escalation point for the field, ensuring customer success in mission-critical environments.</li>\n</ul>\n<p>Competencies &amp; Responsibilities:</p>\n<ul>\n<li>6+ years in a customer-facing, pre-sales or 
consulting role influencing technical executives, driving high-level data strategy and product adoption.</li>\n<li>Proven ability to co-plan large territories with Account Executives and operate in a highly coordinated, cross-functional effort across GTM and R&amp;D teams.</li>\n<li>Experience collaborating with Global System Integrators (GSIs) and third-party consulting organizations to drive customer outcomes.</li>\n<li>Proficient in programming, debugging, and problem-solving using SQL and Python.</li>\n<li>Hands-on experience building solutions within major public cloud environments (AWS, Azure, or GCP).</li>\n<li>Broad experience (in two or more) and understanding across the fields of data engineering, data warehousing, AI, ML, governance, transactional systems, app development, and streaming.</li>\n<li>Undergraduate degree (or higher) in a technical field such as Computer Science, Applied Mathematics, Engineering, or similar.</li>\n<li>A track record of driving complex projects to completion in fast-paced, customer-facing environments.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e111d755-f4e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8437289002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Cloud-based Data Visualisation and Analytics Solutions","Lakehouse analytics architecture","Data visualisation and analytics systems","Apache Spark","Python","SQL"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:52:34.674Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Amsterdam, 
Netherlands"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Cloud-based Data Visualisation and Analytics Solutions, Lakehouse analytics architecture, Data visualisation and analytics systems, Apache Spark, Python, SQL"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_eda5b2b8-a68"},"title":"Senior Solutions Architect - AI/BI","description":"<p>We are seeking a Senior Solutions Architect - AI/BI to join our Field Engineering team in London. The successful candidate will be responsible for executing on Databricks&#39; strategic Product Operating Model, providing enhanced focus on earlier stage, highly prioritized product lines to establish product market fit and set the course for rapid revenue growth.</p>\n<p>As a Senior Solutions Architect - AI/BI, you will work in partnership with direct account teams to jointly engage clients, foster necessary relationships, position in-depth the specific product line, and provide compelling reasons for clients to adopt and grow the usage of the given product.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Collaborating with GTM leadership and account teams to design and execute high-impact engagement strategies across your territory.</li>\n<li>Serving as a trusted advisor, expert Solutions Architect, and champion, building technical credibility with stakeholders to drive product adoption and vision.</li>\n<li>Enabling clients at scale through workshops and developing customer-facing collateral that helps increase technical knowledge and thought leadership.</li>\n<li>Influencing product roadmap by translating field-derived, data-driven insights into strategic recommendations for Product and Engineering teams.</li>\n</ul>\n<p>To succeed in this role, you will need:</p>\n<ul>\n<li>6+ years in a customer-facing, pre-sales or consulting role influencing technical executives, driving high-level data 
strategy and product adoption.</li>\n<li>Proven ability to co-plan large territories with Account Executives and operate in a highly coordinated, cross-functional effort across GTM and R&amp;D teams.</li>\n<li>Experience collaborating with Global System Integrators (GSIs) and third-party consulting organizations to drive customer outcomes.</li>\n<li>Proficient in programming, debugging, and problem-solving using SQL and Python.</li>\n<li>Hands-on experience building solutions within major public cloud environments (AWS, Azure, or GCP).</li>\n<li>Broad experience (in two or more) and understanding across the fields of data engineering, data warehousing, AI, ML, governance, transactional systems, app development, and streaming.</li>\n</ul>\n<p>If you are a motivated and experienced professional with a passion for data and AI, we encourage you to apply for this exciting opportunity.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_eda5b2b8-a68","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8407183002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Experience in designing and delivering cloud-based Data Visualisation and Analytics Solutions","Ability to advise customers in lakehouse analytics architecture","Certification and/or demonstrated competence in data visualisation and analytics systems along with one of Azure, AWS or GCP cloud providers","Demonstrated competence in the Lakehouse architecture including hands-on experience with Apache Spark, Python and 
SQL"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:50:38.084Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London, United Kingdom"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Experience in designing and delivering cloud-based Data Visualisation and Analytics Solutions, Ability to advise customers in lakehouse analytics architecture, Certification and/or demonstrated competence in data visualisation and analytics systems along with one of Azure, AWS or GCP cloud providers, Demonstrated competence in the Lakehouse architecture including hands-on experience with Apache Spark, Python and SQL"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_821d6af4-827"},"title":"Senior Solutions Architect - AI/BI","description":"<p>The Solutions Architect (AI/BI) team executes on Databricks&#39; strategic Product Operating Model to establish product market fit and set the course for rapid revenue growth.</p>\n<p>As a Senior Solutions Architect - AI/BI, you will collaborate with GTM leadership and account teams to design and execute high-impact engagement strategies across your territory.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Collaborating with GTM leadership and account teams to design and execute high-impact engagement strategies across your territory.</li>\n<li>Serving as a trusted advisor and expert Solutions Architect, building technical credibility with stakeholders to drive product adoption and vision.</li>\n<li>Enabling clients at scale through workshops and developing customer-facing collateral that helps increase technical knowledge and thought leadership.</li>\n<li>Influencing product roadmap by translating field-derived, data-driven insights into strategic recommendations for Product and Engineering teams.</li>\n</ul>\n<p>To be successful in this role, you will 
need:</p>\n<ul>\n<li>6+ years in a customer-facing, pre-sales or consulting role influencing technical executives, driving high-level data strategy and product adoption.</li>\n<li>Proven ability to co-plan large territories with Account Executives and operate in a highly coordinated, cross-functional effort across GTM and R&amp;D teams.</li>\n<li>Experience collaborating with Global System Integrators (GSIs) and third-party consulting organizations to drive customer outcomes.</li>\n<li>Proficient in programming, debugging, and problem-solving using SQL and Python.</li>\n<li>Hands-on experience building solutions within major public cloud environments (AWS, Azure, or GCP).</li>\n<li>Broad experience (in two or more) and understanding across the fields of data engineering, data warehousing, AI, ML, governance, transactional systems, app development, and streaming.</li>\n</ul>\n<p>Required skills include:</p>\n<ul>\n<li>Experience in designing and delivering cloud-based Data Visualisation and Analytics Solutions in a client or customer environment.</li>\n<li>Ability to advise customers in lakehouse analytics architecture.</li>\n<li>Certification and/or demonstrated competence in data visualisation and analytics systems along with one of Azure, AWS or GCP cloud providers.</li>\n<li>Demonstrated competence in the Lakehouse architecture including hands-on experience with Apache Spark, Python and SQL.</li>\n</ul>\n<p>Preferred skills include:</p>\n<ul>\n<li>Experience with Databricks products and services.</li>\n<li>Knowledge of data science and machine learning concepts.</li>\n</ul>\n<p>This is a senior-level role that requires a strong background in data and AI, as well as excellent communication and collaboration skills.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_821d6af4-827","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8437301002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Cloud-based Data Visualisation and Analytics Solutions","Lakehouse analytics architecture","Data visualisation and analytics systems","Apache Spark","Python","SQL"],"x-skills-preferred":["Databricks products and services","Data science and machine learning concepts"],"datePosted":"2026-04-18T15:49:47.877Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Munich, Germany"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Cloud-based Data Visualisation and Analytics Solutions, Lakehouse analytics architecture, Data visualisation and analytics systems, Apache Spark, Python, SQL, Databricks products and services, Data science and machine learning concepts"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_205a5f25-1f0"},"title":"Senior Manager, Infrastructure Data Science","description":"<p>Databricks is looking for a Senior Manager, Infrastructure Data Science to shape the future of Databricks infrastructure through data science. 
You will tackle some of the most complex challenges related to capacity planning, performance optimisation, reliability engineering, infrastructure efficiency, and customer experience.</p>\n<p>At Databricks, we enable data teams to solve the world&#39;s toughest problems by building and running the world&#39;s best data and AI infrastructure platform.</p>\n<p>As a Senior Manager, Infrastructure Data Science, you will lead a team of data scientists and work directly in partnership with engineering leaders to empower them with data-driven insights and solutions.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Thought leadership and strategic guidance on infrastructure planning, balancing current needs with future growth projections to ensure scalability and cost-effectiveness.</li>\n<li>Promoting a data-driven approach to infrastructure decisions, influencing stakeholders across engineering, and supporting the use of data science insights for high-impact, aligned strategies.</li>\n<li>Implementing data-driven solutions to identify, predict, and mitigate infrastructure risks and failures, reducing downtime and improving system reliability and performance, directly impacting end-user satisfaction and operational continuity.</li>\n<li>Spearheading analyses to improve resource utilisation efficiency, identifying and eliminating inefficiencies across infrastructure usage, resulting in cost savings and optimised performance.</li>\n<li>Establishing data frameworks that empower support teams to troubleshoot and resolve product issues faster, decreasing response times and enhancing customer experience and support quality.</li>\n<li>Mentoring and managing a team of data scientists, instilling best practices in data science, engineering, and fostering a collaborative environment focused on innovative, scalable infrastructure solutions.</li>\n</ul>\n<p>We look for candidates with 10+ years of infrastructure data science, machine learning, advanced analytics experience in 
high-velocity, high-growth companies, as well as 5+ years of management experience hiring and developing teams.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_205a5f25-1f0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/7734812002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$228,600-$314,250 USD","x-skills-required":["infrastructure data science","machine learning","advanced analytics","data visualisation","data engineering","data modelling","big data technologies","leadership","communication"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:44:53.915Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"infrastructure data science, machine learning, advanced analytics, data visualisation, data engineering, data modelling, big data technologies, leadership, communication","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":228600,"maxValue":314250,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_214894f2-136"},"title":"Research Engineer, Virtual Collaborator (Cowork)","description":"<p>We are looking for a Research Engineer to help us train Claude specifically for virtual collaborator workflows. 
While Claude excels at general tasks, a lot of knowledge work requires targeted training on real organisational data and workflows.</p>\n<p>Your job will be to design and implement reinforcement learning (RL) environments that transform Claude into the best virtual collaborator, training on realistic tasks from navigating internal knowledge to creating financial models.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Training Claude on document manipulation with good taste, including understanding, enhancing, and co-creating (e.g., Office doc formats, data visualisation)</li>\n</ul>\n<ul>\n<li>Designing and implementing reinforcement learning pipelines targeted at virtual collaborator use cases (productivity, organisational navigation, vertical domains)</li>\n</ul>\n<ul>\n<li>Building and scaling our data creation platform for generating high-quality, open-ended tasks with domain experts and crowdworkers</li>\n</ul>\n<ul>\n<li>Integrating real organisational data to create realistic training environments</li>\n</ul>\n<ul>\n<li>Developing robust evaluation systems that maintain quality while avoiding reward hacking</li>\n</ul>\n<ul>\n<li>Partnering directly with product teams (e.g., Cowork, claude.ai) to ensure training aligns with product features</li>\n</ul>\n<p>You may be a good fit if you:</p>\n<ul>\n<li>Are a very experienced Python programmer who can quickly produce reliable, high-quality code that your teammates love using</li>\n</ul>\n<ul>\n<li>Have 5-8 years of strong machine learning experience</li>\n</ul>\n<ul>\n<li>Thrive at the intersection of research and product, with a pragmatic approach to solving real-world problems</li>\n</ul>\n<ul>\n<li>Are comfortable with ambiguity and can balance research rigor with shipping deadlines</li>\n</ul>\n<ul>\n<li>Enjoy collaborating across multiple teams (data operations, model training, product)</li>\n</ul>\n<ul>\n<li>Can context-switch between research problems and product engineering tasks</li>\n</ul>\n<ul>\n<li>Care 
about making AI genuinely helpful for everyday enterprise workflows</li>\n</ul>\n<p>Strong candidates may also have experience with:</p>\n<ul>\n<li>Creating RL envs for realistic tasks</li>\n</ul>\n<ul>\n<li>Reward modelling and preventing reward hacking</li>\n</ul>\n<ul>\n<li>Building human-in-the-loop training systems or crowdsourcing platforms</li>\n</ul>\n<ul>\n<li>Working with enterprise tools and APIs (Google Workspace, Microsoft Office, Slack, etc.)</li>\n</ul>\n<ul>\n<li>Developing evaluation frameworks for open-ended tasks</li>\n</ul>\n<ul>\n<li>Domain expertise in finance, legal, or healthcare workflows</li>\n</ul>\n<ul>\n<li>Creating scalable data pipelines with quality control mechanisms</li>\n</ul>\n<ul>\n<li>Translating product requirements into technical training objectives</li>\n</ul>\n<p>Deadline to apply: None. Applications will be reviewed on a rolling basis.</p>\n<p>The annual compensation range for this role is $500,000-$850,000 USD.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_214894f2-136","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://www.anthropic.com","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/4946308008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$500,000-$850,000 USD","x-skills-required":["Python","Machine Learning","Reinforcement Learning","Data Creation Platform","Data Visualisation","Enterprise Tools and APIs"],"x-skills-preferred":["Human-in-the-loop Training Systems","Crowdsourcing Platforms","Domain Expertise in Finance, Legal, or Healthcare Workflows"],"datePosted":"2026-04-18T15:44:42.622Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City, NY; San Francisco, CA; Seattle, 
WA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Machine Learning, Reinforcement Learning, Data Creation Platform, Data Visualisation, Enterprise Tools and APIs, Human-in-the-loop Training Systems, Crowdsourcing Platforms, Domain Expertise in Finance, Legal, or Healthcare Workflows","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":500000,"maxValue":850000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_49214f94-4ba"},"title":"Senior Manager, Infrastructure Data Science","description":"<p>We are looking for a Senior Manager, Infrastructure Data Science to shape the future of Databricks infrastructure through data science. You will tackle some of the most complex challenges related to capacity planning, performance optimisation, reliability engineering, infrastructure efficiency, and customer experience.</p>\n<p>As a Senior Manager, you will lead a team of data scientists and work directly in partnership with engineering leaders to empower them with data-driven insights and solutions. 
You will promote a data-driven approach to infrastructure decisions, influencing stakeholders across engineering, and supporting the use of data science insights for high-impact, aligned strategies.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Thought leadership and strategic guidance on infrastructure planning, balancing current needs with future growth projections to ensure scalability and cost-effectiveness.</li>\n<li>Implement data-driven solutions to identify, predict, and mitigate infrastructure risks and failures, reducing downtime and improving system reliability and performance, directly impacting end-user satisfaction and operational continuity.</li>\n<li>Spearhead analyses to improve resource utilisation efficiency, identifying and eliminating inefficiencies across infrastructure usage, resulting in cost savings and optimised performance.</li>\n<li>Establish data frameworks that empower support teams to troubleshoot and resolve product issues faster, decreasing response times and enhancing customer experience and support quality.</li>\n<li>Mentor and manage a team of data scientists, instilling best practices in data science, engineering, and fostering a collaborative environment focused on innovative, scalable infrastructure solutions.</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>10+ years of infrastructure data science, machine learning, advanced analytics experience in high-velocity, high-growth companies.</li>\n<li>5+ years of management experience hiring and developing teams.</li>\n<li>Experience developing data science, analytics, and machine learning and AI products and capabilities in a cloud environment.</li>\n<li>Knowledge of statistics and rigorous analytical techniques.</li>\n<li>Experience with data visualisation tools, knowledge of data engineering, data modelling, and big data technologies.</li>\n<li>Leadership skills and experience to lead across functional and organisational lines.</li>\n<li>Strong communication skills to 
explain and evangelise analytics and data science to executives and the senior management team.</li>\n<li>Bias to action and passion for delivering high-quality data solutions.</li>\n<li>A passion for problem-solving and comfort with ambiguity.</li>\n<li>MS or Ph.D. in quantitative fields (Statistics, Math, CS or Engineering).</li>\n</ul>\n<p>Pay Range Transparency</p>\n<p>Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents the expected salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks anticipates utilising the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. 
For more information regarding which range your location is in visit our page here.</p>\n<p>Zone 1 Pay Range $228,600-$314,250 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_49214f94-4ba","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com/","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/7641390002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$228,600-$314,250 USD","x-skills-required":["infrastructure data science","machine learning","advanced analytics","cloud environment","statistics","data visualisation tools","data engineering","data modelling","big data technologies","leadership skills","communication skills","bias to action","passion for problem-solving","comfort with ambiguity","MS or Ph.D. in quantitative fields"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:43:09.520Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Mountain View, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"infrastructure data science, machine learning, advanced analytics, cloud environment, statistics, data visualisation tools, data engineering, data modelling, big data technologies, leadership skills, communication skills, bias to action, passion for problem-solving, comfort with ambiguity, MS or Ph.D. 
in quantitative fields","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":228600,"maxValue":314250,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a8cef3b8-9e4"},"title":"Sr Lead FP&A - Procurement","description":"<p>The Sr Lead FP&amp;A - Procurement role is responsible for supporting financial planning, analysis, and reporting related to supply chain operations.</p>\n<p>This role partners with procurement, logistics, and operations teams to optimize costs, improve efficiency, and drive data-driven decision-making across the supply chain.</p>\n<p>Key responsibilities include analysing supply chain costs, developing and maintaining financial models, monitoring key performance indicators, and preparing monthly financial reports.</p>\n<p>The ideal candidate will have a Bachelor&#39;s degree in finance, accounting, economics, supply chain management, or business administration, and 5-6 years&#39; experience in financial analysis, supply chain, logistics, or operations.</p>\n<p>Preferred qualifications include strong financial modelling and forecasting skills, experience with data visualisation tools, and knowledge of supply chain processes and logistics operations.</p>\n<p>The salary range for this role is $130,000 - $190,000 per year, with bonus, benefits, and equity included in the offer package.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_a8cef3b8-9e4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield 
AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/950c7cc3-bd36-4654-9c2d-786d4dee114c","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$130,000 - $190,000 per year","x-skills-required":["Financial analysis","Supply chain","Logistics","Cost analysis","Budgeting","Inventory management","Procurement","Sourcing","Manufacturing","Distribution"],"x-skills-preferred":["Financial modelling","Forecasting","Data visualisation","Supply chain processes","Logistics operations","Cost accounting"],"datePosted":"2026-04-17T13:00:21.686Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dallas"}},"employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Technology","skills":"Financial analysis, Supply chain, Logistics, Cost analysis, Budgeting, Inventory management, Procurement, Sourcing, Manufacturing, Distribution, Financial modelling, Forecasting, Data visualisation, Supply chain processes, Logistics operations, Cost accounting","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":130000,"maxValue":190000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6fb4a600-300"},"title":"Senior People Systems and Data Analyst","description":"<p>Freenome is seeking a highly analytical and systems-savvy Senior People Systems and Data Analyst to elevate our People data capabilities and drive insight-led decision-making.</p>\n<p>This role will own the development of dashboards, analytics, and reporting that provide visibility into workforce trends and inform hiring, retention, engagement, and organisational planning strategies.</p>\n<p>While the role maintains ownership of People systems and data integrity, its primary focus is generating insights that enable better 
business decisions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Develop, analyse, and interpret workforce data to identify trends, risks, and opportunities across recruiting, retention, performance, and engagement.</li>\n<li>Conduct deep-dive analyses (attrition trends, hiring funnel performance, compensation insights, engagement drivers).</li>\n<li>Build statistical and trend analyses across the employee lifecycle to proactively surface insights that influence People and business decisions.</li>\n<li>Deliver actionable quarterly reporting that translates complex datasets into clear, executive-ready insights and recommendations.</li>\n<li>Design, build, and maintain scalable and automated dashboards.</li>\n<li>Establish consistent KPI definitions and reporting standards.</li>\n<li>Improve data visualisation practices to ensure clarity, usability, and impact.</li>\n<li>Replace manual reporting processes with automated, real-time reporting solutions.</li>\n<li>Ensure high levels of data accuracy, integrity, and trust.</li>\n<li>Serve as primary People data and HRIS reporting expert.</li>\n<li>Partner with IT and vendors to implement and manage integrations, evaluate new features, and identify opportunities to automate data workflows.</li>\n<li>Manage data governance standards to ensure compliance and data privacy best practices.</li>\n<li>Identify process improvements that enhance efficiency, data structure and accessibility.</li>\n<li>Enable tier-zero employee and manager self-service for policies, processes, and data through systems and chatbots, reducing reliance on People team members and ticket systems.</li>\n<li>Champion system education to drive employee and manager self-service adoption and data and systems utilisation.</li>\n<li>Drive AI initiatives such as deploying LLMs for job descriptions and ML to forecast attrition, ensuring tools are trained and continuously refined to be fit-for-purpose and remain within ethical and legal 
guardrails.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s degree in Statistics, Data Science, Business Analytics, HR, Economics, or related field.</li>\n<li>5–7 years of experience in People Analytics, Workforce Analytics, HRIS Analytics, or related analytical roles.</li>\n<li>Prior experience modifying, improving and/or implementing People systems (Greenhouse, Paylocity a bonus).</li>\n<li>Advanced proficiency in data visualisation tools (Tableau, Power BI, Looker, or similar).</li>\n<li>Strong experience building executive dashboards from the ground up.</li>\n<li>Knowledge of data, statistical, and predictive analysis.</li>\n<li>Advanced Excel/Google Sheets skills.</li>\n<li>Experience working with large, complex datasets.</li>\n<li>Demonstrated ability to translate data into clear business insights and recommendations.</li>\n<li>Strong understanding of core People processes (recruiting, compensation, performance, engagement, workforce planning).</li>\n<li>Excellent verbal and written communication skills.</li>\n<li>Strong analytical and problem-solving skills.</li>\n<li>Passion for driving continuous improvement: simplifying workflows, implementing automation, and driving self-service.</li>\n<li>Demonstrated interest in leveraging AI to enhance systems and analytics.</li>\n</ul>\n<p>Nice to haves:</p>\n<ul>\n<li>Experience in a high-growth biotech or startup environment, with exposure to IPO preparation, public company reporting requirements, or SOX-compliant processes.</li>\n<li>Familiarity with Greenhouse, Paylocity, or other HRIS/ATS platforms.</li>\n<li>Exposure to predictive analytics or basic modelling techniques.</li>\n<li>SHRM-CP or SHRM-SCP certification.</li>\n</ul>\n<p>Benefits and additional information:</p>\n<p>The US target range of our base salary for new hires is $120,275 - $170,100. 
You will also be eligible to receive equity, cash bonuses, and a full range of medical, financial, and other benefits depending on the position offered.</p>\n<p>Please note that individual total compensation for this position will be determined at the Company’s sole discretion and may vary based on several factors, including but not limited to, location, skill level, years and depth of relevant experience, and education.</p>\n<p>We invite you to check out our career page @ freenome.com/job-openings/ for additional company information.</p>\n<p>Freenome is proud to be an equal-opportunity employer, and we value diversity. Freenome does not discriminate on the basis of race, colour, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, veteran status, or any other status protected under federal, state, or local law.</p>\n<p>Applicants have rights under Federal Employment Laws.</p>\n<p>Family &amp; Medical Leave Act (FMLA)</p>\n<p>Equal Employment Opportunity (EEO)</p>\n<p>Employee Polygraph Protection Act (EPPA)</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6fb4a600-300","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Freenome","sameAs":"https://freenome.com","logo":"https://logos.yubhub.co/freenome.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/freenome/jobs/8460884002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$120,275 - $170,100","x-skills-required":["data visualisation","statistics","data science","business analytics","HR","economics","People analytics","workforce analytics","HRIS analytics","Greenhouse","Paylocity","Tableau","Power BI","Looker","Excel","Google Sheets","predictive analysis","data 
governance","AI","LLMs","ML"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:36:29.001Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Brisbane, California"}},"employmentType":"FULL_TIME","occupationalCategory":"HR","industry":"biotech","skills":"data visualisation, statistics, data science, business analytics, HR, economics, People analytics, workforce analytics, HRIS analytics, Greenhouse, Paylocity, Tableau, Power BI, Looker, Excel, Google Sheets, predictive analysis, data governance, AI, LLMs, ML","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120275,"maxValue":170100,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_2b0442cb-a86"},"title":"Data Scientist","description":"<p>Join the Portfolio Data Science team, part of Subscriptions Product Insights, to drive the growth and evolution of Spotify&#39;s subscription business.</p>\n<p>As a Data Scientist, you&#39;ll help the Portfolio team make informed decisions about how to grow and diversify Spotify&#39;s paid offerings by analysing subscriber behaviour, measuring the impact of plan changes and new propositions, and surfacing insights that shape our strategy.</p>\n<p>Your work will directly influence how the Spotify experience evolves for users around the world.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Analyse subscriber behaviour and engagement patterns to identify opportunities for plan evolution and new value propositions</li>\n<li>Design and analyse experiments to measure the impact of changes to subscription plans and add-on offerings</li>\n<li>Define and maintain success metrics for Portfolio initiatives, ensuring alignment across cross-functional partners</li>\n<li>Build and maintain dashboards and reporting to monitor the health and performance of subscription plans and new revenue streams</li>\n<li>Perform 
exploratory analyses that uncover growth levers and inform the Portfolio roadmap</li>\n<li>Collaborate with Product, Engineering, Design, User Research and other Data Science teams across the Subscriptions Mission</li>\n<li>Communicate findings and recommendations clearly to both technical and non-technical audiences, including senior leadership</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>3+ years of experience in data science or analytics, ideally at a consumer tech or subscription-based company</li>\n<li>Hold a degree in Statistics, Economics, Computer Science, Mathematics, or a related quantitative field (or equivalent experience)</li>\n<li>Proficient in SQL and Python and comfortable working with large-scale datasets</li>\n<li>Experience designing and analysing A/B tests, with a solid understanding of statistical methods</li>\n<li>Clear communicator who can translate complex analyses into actionable recommendations</li>\n<li>Curious, proactive, and comfortable navigating ambiguity in a fast-moving business</li>\n<li>Enjoy collaborating across teams and are comfortable working in environments that require coordination across multiple stakeholders and workstreams</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Health insurance</li>\n<li>Six month paid parental leave</li>\n<li>401(k) retirement plan</li>\n<li>Monthly meal allowance</li>\n<li>23 paid days off</li>\n<li>Paid flexible holidays</li>\n<li>Paid sick leave</li>\n</ul>\n<p>Salary Range: $76,840 - $109,771, plus equity</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_2b0442cb-a86","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Spotify","sameAs":"https://www.spotify.com","logo":"https://logos.yubhub.co/spotify.com.png"},"x-apply-url":"https://jobs.lever.co/spotify/a95830ad-c11f-49da-85a7-04ce47ce532c","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$76,840 - $109,771, plus equity","x-skills-required":["SQL","Python","A/B testing","Statistical methods","Data analysis","Data visualisation"],"x-skills-preferred":["Machine learning","Cloud computing","Big data processing"],"datePosted":"2026-03-31T18:17:27.987Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, Stockholm or London"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, A/B testing, Statistical methods, Data analysis, Data visualisation, Machine learning, Cloud computing, Big data processing","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":76840,"maxValue":109771,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4b700ee3-482"},"title":"Analytics Engineer (Finance)","description":"<p>We are looking for an Analytics Engineer to join our team. 
As an Analytics Engineer, you will be responsible for translating data requirements from across the organisation into robust and reusable data models, with a particular focus on financial regulatory submissions or financial analytics.</p>\n<p>Maintain consistent and clear documentation and communicate with business stakeholders (both technical and non-technical).</p>\n<p>Collaborate with the wider data team to help meet the business goals, including peer reviews.</p>\n<p>Take ownership of a project end-to-end and manage priorities accordingly.</p>\n<p>Our ideal candidate will have strong experience with SQL, experience working within the credit domain, and be a self-starter with the ability to think outside the box.</p>\n<p>They will also have good attention to detail, strong experience with Looker or a similar visualisation tool, and strong communication and documentation skills for both technical and non-technical audiences.</p>\n<p>As a member of our team, you will have the opportunity to work on a wide range of projects and contribute to the development of our data capabilities.</p>\n<p>We offer a competitive salary and benefits package, including 25 days holiday, an extra day&#39;s holiday for your birthday, and annual leave increased with length of service.</p>\n<p>We are an equal opportunities employer and welcome applications from all qualified candidates.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_4b700ee3-482","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Starling Bank","sameAs":"https://www.starlingbank.com/","logo":"https://logos.yubhub.co/starlingbank.com.png"},"x-apply-url":"https://apply.workable.com/j/D74D88F51C","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Looker","credit domain","data modelling","financial 
analytics"],"x-skills-preferred":["dbt","data visualisation"],"datePosted":"2026-03-20T16:15:09.160Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Southampton"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, Looker, credit domain, data modelling, financial analytics, dbt, data visualisation"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ea916227-08d"},"title":"Data Science Consultant Intern","description":"<p><strong>Data Science Consultant Intern</strong></p>\n<p>You will join Fifty-Five as a Data Science Consultant Intern, working closely with our team of experts to help clients make data-driven decisions. As a Data Science Consultant Intern, you will be responsible for extracting and analysing data, defining key performance indicators, creating data visualisations and developing data-driven solutions.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Participate in client meetings to understand their needs</li>\n<li>Contribute to the development of projects within a dedicated team</li>\n<li>Extract and analyse relevant data according to client needs</li>\n<li>Define key performance indicators to address specific problems</li>\n<li>Create data visualisations to monitor and analyse performance</li>\n<li>Develop data-driven solutions</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>You are a student in a computer science or related field, with a strong background in data science and statistics</li>\n<li>You have a good understanding of SQL and data visualisation tools</li>\n<li>You are able to work independently and as part of a team</li>\n<li>You have excellent communication and problem-solving skills</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Previous experience in marketing, data or consulting</li>\n<li>Strong knowledge of web technologies and 
multimedia</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>250 employees in Paris and over 320 globally</li>\n<li>Multicultural environment with over 20 nationalities represented</li>\n<li>Excellent working conditions and a dynamic work environment</li>\n<li>Opportunities for professional growth and development</li>\n<li>Competitive salary and benefits package</li>\n</ul>\n<p><strong>What&#39;s Next?</strong></p>\n<p>After completing your internship, you will have the opportunity to join Fifty-Five as a full-time employee, with opportunities for career advancement and professional growth.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ea916227-08d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Fifty-Five","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/gysZYfVU1nwWjdnGE7pcSm/data-science-consultant-intern-(h%2Ff)-in-paris-at-fifty-five","x-work-arrangement":"onsite","x-experience-level":"entry","x-job-type":"internship","x-salary-range":"€1200-€1400 per month","x-skills-required":["SQL","data visualisation","data analysis","problem-solving","communication"],"x-skills-preferred":["marketing","data","consulting","web technologies","multimedia"],"datePosted":"2026-03-09T17:01:58.134Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Paris"}},"employmentType":"INTERN","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data visualisation, data analysis, problem-solving, communication, marketing, data, consulting, web technologies, 
multimedia","baseSalary":{"@type":"MonetaryAmount","currency":"EUR","value":{"@type":"QuantitativeValue","minValue":1200,"maxValue":1400,"unitText":"MONTH"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9aaf497d-8cf"},"title":"Senior Data Consultant (H/F)","description":"<p>A senior data consultant is needed to join the team at Fifty-Five. The successful candidate will be responsible for promoting data science and data processing to support digital marketing, following a project plan with different milestones, ensuring data quality and accuracy, and delivering high-quality results to clients.</p>\n<p>The role will involve working closely with the technical and business teams to deliver marketing-oriented AI cases, integrating business requirements into a relevant activation strategy. The consultant will also be responsible for analysing data for clients, responding to digital activity management issues using key performance indicators, and participating in the development of Fifty-Five&#39;s data science offer.</p>\n<p>The ideal candidate will have a degree in engineering or a related field, with at least 2 years of experience in data consulting. 
They will have strong knowledge of Microsoft Office, a strong analytical mindset, excellent communication skills, and a commercial spirit.</p>\n<p>The company offers a range of benefits, including a competitive salary, a 10-euro daily meal ticket, 50% transport costs, a flexible work-life balance, and a modern and stimulating work environment.</p>\n<p>Fifty-Five is committed to diversity and inclusion, and welcomes applications from all candidates, regardless of their background, age, gender, or disability.</p>\n<p>The company is part of The Brandtech Group, a global data company that helps brands collect, analyse and activate their data across paid, earned and owned channels to increase their marketing ROI and improve customer acquisition and retention.</p>\n<p>The role is based in Paris, with the opportunity to work on a global scale. The company is looking for a senior data consultant to join its team, with a competitive salary and a range of benefits.</p>\n<p>If you are a senior data consultant looking for a new challenge, please apply now.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Promote data science and data processing to support digital marketing</li>\n<li>Follow a project plan with different milestones</li>\n<li>Ensure data quality and accuracy</li>\n<li>Deliver high-quality results to clients</li>\n<li>Work closely with technical and business teams to deliver marketing-oriented AI cases</li>\n<li>Integrate business requirements into a relevant activation strategy</li>\n<li>Analyse data for clients</li>\n<li>Respond to digital activity management issues using key performance indicators</li>\n<li>Participate in the development of Fifty-Five&#39;s data science offer</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>Degree in engineering or a related field</li>\n<li>At least 2 years of experience in data consulting</li>\n<li>Strong knowledge of Microsoft Office</li>\n<li>Strong analytical mindset</li>\n<li>Excellent communication 
skills</li>\n<li>Commercial spirit</li>\n</ul>\n<p><strong>Preferred skills:</strong></p>\n<ul>\n<li>Strong knowledge of Python</li>\n<li>Experience with data visualisation tools</li>\n<li>Knowledge of machine learning algorithms</li>\n<li>Experience with data warehousing and business intelligence tools</li>\n</ul>\n<p><strong>Benefits:</strong></p>\n<ul>\n<li>Competitive salary</li>\n<li>10-euro daily meal ticket</li>\n<li>50% transport costs</li>\n<li>Flexible work-life balance</li>\n<li>Modern and stimulating work environment</li>\n</ul>\n<p><strong>Company culture:</strong></p>\n<ul>\n<li>Fifty-Five is committed to diversity and inclusion</li>\n<li>Welcomes applications from all candidates, regardless of their background, age, gender, or disability</li>\n<li>Part of The Brandtech Group, a global data company that helps brands collect, analyse and activate their data across paid, earned and owned channels to increase their marketing ROI and improve customer acquisition and retention</li>\n</ul>\n<p><strong>Location:</strong></p>\n<ul>\n<li>Paris, France</li>\n</ul>\n<p><strong>Type of contract:</strong></p>\n<ul>\n<li>Full-time</li>\n</ul>\n<p><strong>Salary:</strong></p>\n<ul>\n<li>Competitive salary</li>\n</ul>\n<p><strong>Workplace type:</strong></p>\n<ul>\n<li>On-site</li>\n</ul>\n<p><strong>Category:</strong></p>\n<ul>\n<li>Data science</li>\n</ul>\n<p><strong>Industry:</strong></p>\n<ul>\n<li>Technology</li>\n</ul>\n<p><strong>Salary range:</strong></p>\n<ul>\n<li>Competitive salary</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9aaf497d-8cf","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Fifty-Five","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/ap7XkdJEREJccZrXVta5Fv/senior-data-consultant-(h%2Ff)-in-paris-at-fifty-five","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"Competitive salary","x-skills-required":["Microsoft Office","Data science","Data processing","Digital marketing","Machine learning","Data visualisation","Data warehousing","Business intelligence"],"x-skills-preferred":["Python","Data visualisation tools","Machine learning algorithms","Data warehousing and business intelligence tools"],"datePosted":"2026-03-09T17:01:51.734Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Paris"}},"employmentType":"FULL_TIME","occupationalCategory":"Data science","industry":"Technology","skills":"Microsoft Office, Data science, Data processing, Digital marketing, Machine learning, Data visualisation, Data warehousing, Business intelligence, Python, Data visualisation tools, Machine learning algorithms, Data warehousing and business intelligence tools"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_f867ca73-2e0"},"title":"Lead Data Consultant (H/F) Paris","description":"<p><strong>A leading data company in Paris</strong></p>\n<p>Fifty-Five is a global data company that helps brands collect, analyse and activate their data across paid, earned and owned channels to increase their marketing ROI and improve customer acquisition and retention.</p>\n<p>We are looking for a Lead Data Consultant to join our team in Paris. 
As a Lead Data Consultant, you will be responsible for leading data projects and working closely with our clients to understand their data needs and develop solutions to meet those needs.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Lead data projects from start to finish, including data collection, analysis and activation</li>\n<li>Work closely with clients to understand their data needs and develop solutions to meet those needs</li>\n<li>Collaborate with our data team to develop and implement data strategies</li>\n<li>Analyse data to identify trends and insights that can inform business decisions</li>\n<li>Develop and maintain relationships with clients to ensure their data needs are met</li>\n<li>Stay up-to-date with the latest data trends and technologies</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>3-6 years of experience in data analysis and consulting</li>\n<li>Strong understanding of data analysis and statistical techniques</li>\n<li>Experience working with large datasets and data visualisation tools</li>\n<li>Excellent communication and project management skills</li>\n<li>Ability to work independently and as part of a team</li>\n<li>Strong analytical and problem-solving skills</li>\n<li>Experience working with data platforms such as Google Analytics and Google Cloud Platform</li>\n<li>Strong understanding of data privacy and security regulations</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunity to work with a leading data company in Paris</li>\n<li>Collaborative and dynamic work environment</li>\n<li>Professional development opportunities</li>\n<li>Flexible working hours and remote work options</li>\n<li>Access to the latest data tools and technologies</li>\n<li>Opportunity to work on a variety of data projects and clients</li>\n<li>Recognition and rewards for outstanding performance</li>\n</ul>\n<p><strong>How to Apply</strong></p>\n<p>If you are a 
motivated and experienced data professional looking for a new challenge, please submit your application, including your resume and a cover letter, to [insert contact information]. We look forward to hearing from you!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_f867ca73-2e0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Fifty-Five","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/wPj5jcg35AZgUXYdKWsC6a/lead-data-consultant-(h%2Ff)-paris-in-paris-at-fifty-five","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data analysis","data visualisation","data strategy","data privacy","data security","Google Analytics","Google Cloud Platform"],"x-skills-preferred":["data science","machine learning","data engineering","data architecture"],"datePosted":"2026-03-09T16:54:40.887Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Paris"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data analysis, data visualisation, data strategy, data privacy, data security, Google Analytics, Google Cloud Platform, data science, machine learning, data engineering, data architecture"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_b1d522e6-6ca"},"title":"Data Engineering & Data Science Consultant","description":"<p><strong>Data Engineering &amp; Data Science Consultant</strong></p>\n<p>You will work hands-on on the design, build, and operationalisation of modern data and analytics solutions. 
You will contribute across the full lifecycle – from data ingestion and transformation to analytics, machine learning, and production deployment. You will collaborate closely with data engineers, architects, data scientists, and business stakeholders to deliver scalable, reliable, and value-driven data solutions in complex client environments.</p>\n<p><strong>Your role will include:</strong></p>\n<ul>\n<li>Applying data science and machine learning techniques to real-world business problems</li>\n<li>Working with structured and semi-structured data in data lakes, lakehouses, and data warehouses</li>\n<li>Developing and optimising data transformations for analytical and machine learning workloads</li>\n<li>Supporting the productionisation of data and ML solutions, including monitoring and optimisation</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>3–5 years of experience in data engineering, data science, or analytics</li>\n<li>Hands-on experience delivering data and analytics solutions in project-based or client environments</li>\n<li>Strong problem-solving skills and a pragmatic, delivery-oriented mindset</li>\n</ul>\n<p><strong>Data Engineering Foundations</strong></p>\n<ul>\n<li>Experience building end-to-end data pipelines (ingestion, transformation, storage)</li>\n<li>Solid understanding of data modelling, data transformations, and feature engineering</li>\n<li>Familiarity with cloud-based data platforms, such as Azure, AWS, or GCP</li>\n<li>Understanding of CI/CD concepts and production-grade deployments</li>\n</ul>\n<p><strong>Applied Data Science &amp; Analytics</strong></p>\n<ul>\n<li>Experience applying statistical analysis and machine learning techniques</li>\n<li>Strong programming skills in Python</li>\n<li>Very good SQL skills and experience working with relational databases</li>\n<li>Experience deploying or supporting ML models in production environments</li>\n<li>Ability to translate analytical results into business-relevant 
insights</li>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>\n</ul>\n<p><strong>Nice to have</strong></p>\n<ul>\n<li>Experience with streaming technologies (e.g. Kafka, Azure Event Hubs)</li>\n<li>Exposure to GenAI, NLP, time series, or advanced analytics use cases</li>\n<li>Experience with NoSQL databases (e.g. MongoDB, Cosmos DB)</li>\n<li>Familiarity with Docker and Kubernetes</li>\n<li>Experience with data visualisation tools (e.g. Power BI, Tableau)</li>\n<li>Cloud or data-related certifications</li>\n</ul>\n<p><strong>Language &amp; Mobility</strong></p>\n<ul>\n<li>Very good English skills</li>\n<li>Willingness to travel for project-related work</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice you will be utilising the most innovative technological solutions in the modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>\n<p><strong>About Infosys Consulting</strong></p>\n<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market-leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>\n<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. 
To learn more about Infosys Consulting and our values, please visit our careers page.</p>\n<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us to its list of Best Firms to Work For. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>\n<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_b1d522e6-6ca","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Infosys Consulting - Europe","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/sJqhkr23sMG2F6ppqk2BQn/remote-data-engineering-%26-data-science-consultant-in-poland-at-infosys-consulting---europe","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data science","machine learning","data engineering","data analytics","cloud-based data platforms","Azure","AWS","GCP","Python","SQL","relational databases","data visualisation tools","Power BI","Tableau"],"x-skills-preferred":["streaming technologies","GenAI","NLP","time series","advanced analytics","NoSQL 
databases","Docker","Kubernetes"],"datePosted":"2026-03-09T16:51:38.126Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Poland"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data science, machine learning, data engineering, data analytics, cloud-based data platforms, Azure, AWS, GCP, Python, SQL, relational databases, data visualisation tools, Power BI, Tableau, streaming technologies, GenAI, NLP, time series, advanced analytics, NoSQL databases, Docker, Kubernetes"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6f741e47-003"},"title":"Finance","description":"<p>You will work in the finance department of the Atlassian Williams F1 Team. The team is responsible for managing the financial aspects of the organisation, including budgeting, forecasting, and financial reporting. Your role will be to support the finance team in their day-to-day activities, including data analysis, financial modelling, and financial planning.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Support the finance team in their day-to-day activities, including data analysis, financial modelling, and financial planning.</li>\n<li>Assist in the preparation of financial reports, including balance sheets, income statements, and cash flow statements.</li>\n<li>Analyse financial data to identify trends and areas for improvement.</li>\n<li>Develop and maintain financial models to support business decisions.</li>\n<li>Collaborate with other teams, including operations and management, to ensure financial information is accurate and up-to-date.</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Opportunity to work with a leading finance organisation in the motorsport industry.</li>\n<li>Collaborative and dynamic work environment.</li>\n<li>Professional development opportunities.</li>\n<li>Competitive salary and benefits 
package.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6f741e47-003","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Atlassian Williams F1 Team","sameAs":"https://careers.williamsf1.com","logo":"https://logos.yubhub.co/careers.williamsf1.com.png"},"x-apply-url":"https://careers.williamsf1.com/job/tax-assistant-in-grove-wantage-jid-485","x-work-arrangement":"onsite","x-experience-level":"entry","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["financial analysis","financial modelling","financial planning","data analysis","financial reporting"],"x-skills-preferred":["Excel","financial software","data visualisation"],"datePosted":"2026-03-09T10:07:09.339Z","employmentType":"FULL_TIME","occupationalCategory":"Finance","industry":"Motorsport","skills":"financial analysis, financial modelling, financial planning, data analysis, financial reporting, Excel, financial software, data visualisation"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8516ca2f-5df"},"title":"Data Science Engineer, Capacity & Efficiency","description":"<p><strong>About the Role</strong></p>\n<p>As a member of the Compute team, you will play a critical role in Anthropic&#39;s mission of building safe and beneficial AI by ensuring we understand, optimize, and strategically manage our cloud infrastructure spend. 
Your work will directly impact how efficiently we operate our multi-cloud and datacenter footprint, from forecasting infrastructure needs and planning capacity, to driving utilization improvements and reducing unit costs across our compute, storage, and networking resources.</p>\n<p>You will work closely with Compute Finance, Infrastructure Engineers, and Product to translate raw cloud billing data into actionable efficiency insights and influence capacity planning &amp; allocation. You will help build deep visibility into our infrastructure spend, forecast capacity needs, attribute costs accurately across teams and workloads, model resource demand curves, and help identify efficiency opportunities across our fleet.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Build and maintain cloud cost attribution models that accurately allocate infrastructure spend (compute, accelerators, storage, networking, data transfer) across teams, products, and workloads, providing clear visibility into who is spending what and why.</li>\n</ul>\n<ul>\n<li>Build and maintain cost-of-revenue pipelines and models</li>\n</ul>\n<ul>\n<li>Partner with infrastructure, finance, and procurement stakeholders to analyse utilization patterns, identify inefficiencies, and drive optimization initiatives that improve the cost-effectiveness of our non-accelerator cloud resources.</li>\n</ul>\n<ul>\n<li>Develop forecasting models for non-accelerator infrastructure demand, incorporating business growth projections, product roadmaps, and historical spend trends to enable proactive capacity planning and budget accuracy.</li>\n</ul>\n<ul>\n<li>Define and track unit cost metrics (e.g., cost per request, cost per GB stored, cost per pipeline run) and identify opportunities to reduce them, influencing infrastructure and engineering roadmaps with data-driven recommendations.</li>\n</ul>\n<ul>\n<li>Develop unit cost economics for various workloads and applications, and use the metrics to drive 
efficiency efforts across product and infrastructure teams.</li>\n</ul>\n<ul>\n<li>Build a cost-aware culture across the organisation by creating self-serve dashboards, automated reporting, and accessible datasets that give engineering and finance teams clear visibility into cloud spend and efficiency metrics.</li>\n</ul>\n<p><strong>You might be a good fit if you have:</strong></p>\n<ul>\n<li>6+ years of experience in data science, analytics, or FinOps roles, with a focus on cloud infrastructure cost analysis, capacity planning, or efficiency optimisation.</li>\n</ul>\n<ul>\n<li>Experience building spend forecasting models and large-scale cost attribution systems.</li>\n</ul>\n<ul>\n<li>Deep knowledge of cloud billing systems, cost allocation methodologies, and spend optimisation levers (e.g., reserved instances, committed use discounts, rightsizing, spot/preemptible usage).</li>\n</ul>\n<ul>\n<li>A passion for the company&#39;s mission of building helpful, honest, and harmless AI.</li>\n</ul>\n<ul>\n<li>Expertise in Python, SQL, forecasting, data modelling and data visualisation tools.</li>\n</ul>\n<ul>\n<li>A bias for action and urgency, not letting perfect be the enemy of the effective.</li>\n</ul>\n<ul>\n<li>A strong disposition to thrive in ambiguity, taking initiative to create clarity and forward progress.</li>\n</ul>\n<ul>\n<li>A deep curiosity and energy for pulling the thread on hard questions.</li>\n</ul>\n<ul>\n<li>Experience in turning open questions and data into concise and insightful analysis.</li>\n</ul>\n<ul>\n<li>Highly effective written communication and presentation skills.</li>\n</ul>\n<p><strong>Logistics</strong></p>\n<p><strong>Education requirements:</strong> We require at least a Bachelor&#39;s degree in a related field or equivalent experience. <strong>Location-based hybrid policy:</strong> Currently, we expect all staff to be in one of our offices at least 25% of the time. 
However, some roles may require more time in our offices.</p>\n<p><strong>Visa sponsorship:</strong> We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</p>\n<p><strong>We encourage you to apply even if you do not believe you meet every single qualification.</strong> Not all strong candidates will meet every single qualification as listed. Research shows that people who identify as being from underrepresented groups are more prone to experiencing imposter syndrome and doubting the strength of their candidacy, so we urge you not to exclude yourself prematurely and to submit an application if you&#39;re interested in this work.</p>\n<p><strong>Your safety matters to us.</strong> To protect yourself from potential scams, remember that Anthropic recruiters only contact you from @anthropic.com email addresses. 
In some cases, we may partner with vetted recruiting agencies who will identify themselves as working on behalf of Anthropic.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_8516ca2f-5df","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://job-boards.greenhouse.io","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/5125881008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$275,000 - $370,000 USD","x-skills-required":["cloud infrastructure cost analysis","capacity planning","efficiency optimisation","Python","SQL","forecasting","data modelling","data visualisation"],"x-skills-preferred":["reserved instances","committed use discounts","rightsizing","spot/preemptible usage"],"datePosted":"2026-03-08T13:59:33.909Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City, NY; San Francisco, CA; Seattle, WA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"cloud infrastructure cost analysis, capacity planning, efficiency optimisation, Python, SQL, forecasting, data modelling, data visualisation, reserved instances, committed use discounts, rightsizing, spot/preemptible usage","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":275000,"maxValue":370000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_442b4d5e-4a8"},"title":"Research Engineer, Virtual Collaborator (Cowork)","description":"<p><strong>About the role</strong></p>\n<p>We are looking for a Research Engineer to help us train Claude specifically for virtual collaborator workflows. 
While Claude excels at general tasks, a lot of knowledge work requires targeted training on real organisational data and workflows. Your job will be to design and implement reinforcement learning (RL) environments that transform Claude into the best virtual collaborator, training on realistic tasks from navigating internal knowledge to creating financial models.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Training Claude on document manipulation with good taste, including understanding, enhancing, and co-creating (e.g., Office doc formats, data visualisation)</li>\n<li>Designing and implementing reinforcement learning pipelines targeted at virtual collaborator use cases (productivity, organisational navigation, vertical domains)</li>\n<li>Building and scaling our data creation platform for generating high-quality, open-ended tasks with domain experts and crowdworkers</li>\n<li>Integrating real organisational data to create realistic training environments</li>\n<li>Developing robust evaluation systems that maintain quality while avoiding reward hacking</li>\n<li>Partnering directly with product teams (e.g., Cowork, claude.ai) to ensure training aligns with product features</li>\n</ul>\n<p><strong>You may be a good fit if you:</strong></p>\n<ul>\n<li>Are a very experienced Python programmer who can quickly produce reliable, high-quality code that your teammates love using</li>\n<li>Have 5-8 years of strong machine learning experience</li>\n<li>Thrive at the intersection of research and product, with a pragmatic approach to solving real-world problems</li>\n<li>Are comfortable with ambiguity and can balance research rigor with shipping deadlines</li>\n<li>Enjoy collaborating across multiple teams (data operations, model training, product)</li>\n<li>Can context-switch between research problems and product engineering tasks</li>\n<li>Care about making AI genuinely helpful for everyday enterprise workflows</li>\n</ul>\n<p><strong>Strong candidates may also have 
experience with:</strong></p>\n<ul>\n<li>Creating RL environments for realistic tasks</li>\n<li>Reward modelling and preventing reward hacking</li>\n<li>Building human-in-the-loop training systems or crowdsourcing platforms</li>\n<li>Working with enterprise tools and APIs (Google Workspace, Microsoft Office, Slack, etc.)</li>\n<li>Developing evaluation frameworks for open-ended tasks</li>\n<li>Domain expertise in finance, legal, or healthcare workflows</li>\n<li>Creating scalable data pipelines with quality control mechanisms</li>\n<li>Translating product requirements into technical training objectives</li>\n</ul>\n<p><strong>Deadline to apply:</strong></p>\n<p>None. Applications will be reviewed on a rolling basis.</p>\n<p><strong>Logistics</strong></p>\n<p><strong>Education requirements:</strong> We require at least a Bachelor&#39;s degree in a related field or equivalent experience. <strong>Location-based hybrid policy:</strong> Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</p>\n<p><strong>Visa sponsorship:</strong></p>\n<p>We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</p>\n<p><strong>We encourage you to apply even if you do not believe you meet every single qualification.</strong></p>\n<p>Not all strong candidates will meet every single qualification as listed. 
Research shows that people who identify as being from underrepresented groups are more prone to experiencing imposter syndrome and doubting the strength of their candidacy, so we urge you not to exclude yourself prematurely and to submit an application if you&#39;re interested in this work.</p>\n<p><strong>Your safety matters to us.</strong></p>\n<p>To protect yourself from potential scams, remember that Anthropic recruiters only contact you from @anthropic.com email addresses. In some cases, we may partner with vetted recruiting agencies who will identify themselves as working on behalf of Anthropic. Be cautious of emails from other domains. Legitimate Anthropic recruiters will never ask for money, fees, or banking information before your first day. If you&#39;re ever unsure about a communication, don&#39;t click any links—visit anthropic.com/careers directly for confirmed position openings.</p>\n<p><strong>How we&#39;re different</strong></p>\n<p>We believe that the highest-impact AI research will be big science. At Anthropic we work as a single cohesive team on just a few large-scale research efforts. And we value impact — advancing our long-term goals of steerable, trustworthy AI — rather than work on smaller and more specific puzzles. We view AI research as an empirical science, which has as much in common with physics and biology as with traditional efforts in computer science. We&#39;re an extremely collaborative group, and we host frequent research discussions to ensure that we are pursuing the highest-impact work at any given time. 
As such, we greatly value communication skills.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_442b4d5e-4a8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://job-boards.greenhouse.io","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/4946308008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$500,000 - $850,000 USD","x-skills-required":["Python","Machine learning","Reinforcement learning","Data visualisation","Enterprise tools and APIs"],"x-skills-preferred":["Human-in-the-loop training systems","Crowdsourcing platforms","Domain expertise in finance, legal, or healthcare workflows"],"datePosted":"2026-03-08T13:46:25.630Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City, NY; San Francisco, CA; Seattle, WA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Machine learning, Reinforcement learning, Data visualisation, Enterprise tools and APIs, Human-in-the-loop training systems, Crowdsourcing platforms, Domain expertise in finance, legal, or healthcare workflows","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":500000,"maxValue":850000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8ae6102f-700"},"title":"GRC Automation Engineering Lead","description":"<p><strong>About the Role</strong></p>\n<p>We are seeking a GRC Automation Lead to join our GRC organisation and build the technical foundation for how we scale our risk and compliance programs. 
In this role, you will lead the team that designs and implements automated workflows, data pipelines, and integrations that transform manual compliance processes into scalable engineering systems.</p>\n<p>This is a greenfield opportunity to establish the team, architecture, and integrations that will define how we approach governance, risk, and compliance at Anthropic. The core challenge is a data problem: compliance information lives across dozens of systems—cloud infrastructure, identity providers, HR platforms, ticketing tools, code repositories—and your job is to design systems that bring it together, normalise it, and make it actionable.</p>\n<p>At Anthropic, you&#39;ll also have a unique advantage: the ability to design AI-powered workflows where Claude acts as an extension of your team, handling tasks that would traditionally require additional headcount or manual effort. You&#39;ll need ingenuity to identify where agentic AI can accelerate evidence collection, interpret unstructured data, triage compliance gaps, and augment human judgment in risk assessments.</p>\n<p>Working closely with Security, IT, and Engineering teams, you&#39;ll translate compliance and regulatory requirements into solutions that support audit programs including SOC 2, ISO, HIPAA, and FedRAMP, building systems that combine traditional automation with AI capabilities to achieve scale that wouldn&#39;t otherwise be possible.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Lead the team that establishes foundational GRC processes and architecture. Design and build automated workflows for risk management and compliance, creating scalable systems that enable continuous monitoring as Anthropic grows.</li>\n</ul>\n<ul>\n<li>Build data pipelines that aggregate risk, control, and asset information from across our technology stack. 
This means solving hard data integration problems: mapping disparate schemas, handling inconsistent data quality, and creating unified views of compliance posture through dashboards and reporting tools.</li>\n</ul>\n<ul>\n<li>Inform GRC platform strategy and implementation: in partnership with other programs, evaluate, select, and deploy tooling that meets our compliance requirements.</li>\n</ul>\n<ul>\n<li>Translate written policies and compliance requirements into policy-as-code—working with Engineering and Security teams to express requirements as enforceable rules, automated checks, and continuous validation rather than static documents.</li>\n</ul>\n<ul>\n<li>Establish feedback loops between policy and implementation: surface where technical controls diverge from written requirements, identify where policies need to evolve based on infrastructure realities, and ensure that compliance requirements are expressed in terms engineers can act on.</li>\n</ul>\n<ul>\n<li>Design and deploy agentic AI workflows that extend team capacity, using Claude to automate evidence analysis, monitor control effectiveness, draft audit responses, interpret policy documents, and handle other tasks that require reasoning over unstructured information.</li>\n</ul>\n<ul>\n<li>Design and maintain integrations connecting GRC tooling with cloud infrastructure, identity management systems, HRIS platforms, ticketing systems, version control, and CI/CD pipelines—working with engineers to implement integrations that enable automated evidence collection and continuous compliance validation.</li>\n</ul>\n<ul>\n<li>Build and lead the GRC Automation function as we scale: hiring team members, establishing practices, and defining the technical roadmap for governance and compliance automation at Anthropic.</li>\n</ul>\n<p><strong>You may be a good fit if you:</strong></p>\n<ul>\n<li>Have 3-4+ years of experience managing technical individual contributors or systems-focused teams, with a proven track 
record of building or scaling small teams (2-5 people) in security, compliance, automation, or operations functions.</li>\n</ul>\n<ul>\n<li>Are a systems thinker first. You understand how complex environments work: how data flows between systems, where integration points exist, what breaks when systems don&#39;t talk to each other. Your strength is designing the right architecture and environment for security monitoring, not necessarily implementing it yourself.</li>\n</ul>\n<ul>\n<li>Have 5+ years of experience designing automated workflows, data pipelines, or system integrations, whether through traditional development, low-code platforms, GRC tools, or process automation. We care about your ability to solve integration problems, not your programming language proficiency.</li>\n</ul>\n<ul>\n<li>Can write production-level code in at least one programming language (e.g., Python, Rust, Go)</li>\n</ul>\n<ul>\n<li>Have a relentless focus on data integration: you understand how to pull data from multiple sources, normalise it, join it meaningfully, and surface insights. You&#39;re comfortable reasoning about messy, inconsistent data and designing systems that handle edge cases gracefully.</li>\n</ul>\n<ul>\n<li>Understand APIs and integration patterns conceptually: REST APIs, webhooks, authentication flows, polling vs. 
push architectures, and can evaluate systems based on how well they expose data and support automation, even if you&#39;re not writing the integration code yourself.</li>\n</ul>\n<ul>\n<li>Can work independently with minimal guidance, taking ownership of complex problems from design through implementation while managing ambiguity inherent in early-stage programs.</li>\n</ul>\n<ul>\n<li>Have strong analytical and problem-solving skills, with the ability to break down complex problems into manageable parts and develop creative solutions.</li>\n</ul>\n<ul>\n<li>Are able to communicate complex technical ideas to both technical and non-technical stakeholders, with a strong focus on collaboration and teamwork.</li>\n</ul>\n<ul>\n<li>Are passionate about staying up-to-date with industry trends and emerging technologies, with a willingness to learn and adapt to new tools and techniques.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_8ae6102f-700","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anthropic","sameAs":"https://job-boards.greenhouse.io","logo":"https://logos.yubhub.co/anthropic.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/anthropic/jobs/4980335008","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["GRC","Automation","Data Pipelines","System Integrations","APIs","Integration Patterns","REST APIs","Webhooks","Authentication Flows","Polling vs. 
Push Architectures","Data Integration","Data Normalisation","Data Joining","Data Modelling","Data Analysis","Data Visualisation","Agile Methodologies","Scrum","Kanban","Continuous Integration","Continuous Deployment","Continuous Monitoring","Cloud Infrastructure","Identity Providers","HR Platforms","Ticketing Tools","Code Repositories","Version Control","CI/CD Pipelines","GRC Tools","Policy-as-Code","Automated Checks","Continuous Validation","Feedback Loops","Policy Implementation","Technical Controls","Policy Evolution","Infrastructure Realities","Compliance Requirements","Engineer Communication","Technical Ideas","Collaboration","Teamwork","Industry Trends","Emerging Technologies","Learning","Adaptation","New Tools","New Techniques"],"x-skills-preferred":["Python","Rust","Go","Java","C++","JavaScript","TypeScript","SQL","NoSQL","Cloud Computing","DevOps","Security","Compliance","Risk Management","Audit Programs","SOC 2","ISO","HIPAA","FedRAMP","GRC Platforms","GRC Tools","Policy Management","Compliance Management","Risk Management","Audit Management","Compliance Automation","GRC Automation","Policy Automation","Compliance Orchestration","Risk Orchestration","Audit Orchestration","Compliance Intelligence","Risk Intelligence","Audit Intelligence","Compliance Analytics","Risk Analytics","Audit Analytics","Compliance Reporting","Risk Reporting","Audit Reporting","Compliance Dashboarding","Risk Dashboarding","Audit Dashboarding"],"datePosted":"2026-03-08T13:43:53.373Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA | New York City, NY | Seattle, WA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"GRC, Automation, Data Pipelines, System Integrations, APIs, Integration Patterns, REST APIs, Webhooks, Authentication Flows, Polling vs. 
Push Architectures, Data Integration, Data Normalisation, Data Joining, Data Modelling, Data Analysis, Data Visualisation, Agile Methodologies, Scrum, Kanban, Continuous Integration, Continuous Deployment, Continuous Monitoring, Cloud Infrastructure, Identity Providers, HR Platforms, Ticketing Tools, Code Repositories, Version Control, CI/CD Pipelines, GRC Tools, Policy-as-Code, Automated Checks, Continuous Validation, Feedback Loops, Policy Implementation, Technical Controls, Policy Evolution, Infrastructure Realities, Compliance Requirements, Engineer Communication, Technical Ideas, Collaboration, Teamwork, Industry Trends, Emerging Technologies, Learning, Adaptation, New Tools, New Techniques, Python, Rust, Go, Java, C++, JavaScript, TypeScript, SQL, NoSQL, Cloud Computing, DevOps, Security, Compliance, Risk Management, Audit Programs, SOC 2, ISO, HIPAA, FedRAMP, GRC Platforms, GRC Tools, Policy Management, Compliance Management, Risk Management, Audit Management, Compliance Automation, GRC Automation, Policy Automation, Compliance Orchestration, Risk Orchestration, Audit Orchestration, Compliance Intelligence, Risk Intelligence, Audit Intelligence, Compliance Analytics, Risk Analytics, Audit Analytics, Compliance Reporting, Risk Reporting, Audit Reporting, Compliance Dashboarding, Risk Dashboarding, Audit Dashboarding"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_85b77752-f3d"},"title":"Data Scientist","description":"<p><strong>About the role</strong></p>\n<p>As an early member of Cursor&#39;s data team, you&#39;ll help build an AI data program operating at incredible scale while partnering directly with founders and area leads. 
You&#39;ll work hands-on across the entire data stack and at the bleeding edge of AI, turning billions of user-AI interactions into strategy that gets users to &#39;aha&#39; faster and expands their usage.</p>\n<p><strong>What you&#39;ll do</strong></p>\n<ul>\n<li>Partner with area leads to shape data-informed decisions across the business.</li>\n<li>Define, track, and own metrics like power usage and user satisfaction that multiple teams and leadership depend on.</li>\n<li>Run experiments end-to-end: design, analyse, and translate into clear product recommendations.</li>\n<li>Build pipelines, dashboards, and analyses that make self-serve insights accessible and trustworthy.</li>\n<li>Establish data culture and foundations as an early member of the data team.</li>\n</ul>\n<p><strong>You may be a fit if</strong></p>\n<ul>\n<li>You have at least 2-4 years of full-time data science experience.</li>\n<li>You have a strong track record of shipping high-impact work when operating in ambiguity.</li>\n<li>You can turn complex data into clear insights and stories for engineers, PMs, and execs.</li>\n<li>You&#39;ve worked at a hyper-growth startup or research org—you know how to be scrappy and ship insights across multiple product areas.</li>\n<li>You&#39;re fluent in SQL, Python, and AB testing, and can write pipelines to unblock yourself.</li>\n</ul>\n<p><strong>Bonus points if</strong></p>\n<ul>\n<li>You have hands-on experience with dbt.</li>\n<li>You have experience working on productivity software or AI tooling.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_85b77752-f3d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Cursor","sameAs":"https://cursor.com","logo":"https://logos.yubhub.co/cursor.com.png"},"x-apply-url":"https://cursor.com/careers/data-scientist-agents","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","Python","AB testing","dbt","productivity software","AI tooling"],"x-skills-preferred":["data science","data analysis","data visualisation","machine learning"],"datePosted":"2026-03-08T00:15:25.434Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, AB testing, dbt, productivity software, AI tooling, data science, data analysis, data visualisation, machine learning"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_cbf4a173-e70"},"title":"Data Scientist","description":"<p>We are seeking a highly skilled Data Scientist to join our team. 
As a Data Scientist, you will play a key role in analysing and interpreting complex data sets to inform our racing strategy and improve our performance on the track.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Work closely with our racing team to understand their needs and develop data-driven solutions to improve their performance</li>\n<li>Develop and maintain complex data models and algorithms to analyse and interpret large data sets</li>\n<li>Collaborate with our data engineering team to design and implement data pipelines and architectures</li>\n<li>Communicate complex technical information to non-technical stakeholders, including our racing team and senior management</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>A strong background in data science, including experience with machine learning and statistical modelling</li>\n<li>Proficiency in programming languages such as Python and R</li>\n<li>Experience with data visualisation tools such as Tableau and Power BI</li>\n<li>Strong communication and interpersonal skills</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunity to work with a highly skilled and experienced team</li>\n<li>Access to cutting-edge technology and resources</li>\n<li>Flexible working hours and remote working options</li>\n<li>Regular social events and team-building activities</li>\n</ul>\n<p><strong>Preferred Qualifications</strong></p>\n<ul>\n<li>Experience working in a Formula One team or a similar high-performance environment</li>\n<li>Knowledge of racing strategy and tactics</li>\n<li>Experience with data visualisation tools such as Matplotlib and Seaborn</li>\n<li>Strong understanding of data governance and data quality principles</li>\n</ul>\n<p>If you are a highly motivated and skilled Data Scientist looking for a new challenge, please apply now.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_cbf4a173-e70","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Williams Racing","sameAs":"https://careers.williamsf1.com","logo":"https://logos.yubhub.co/careers.williamsf1.com.png"},"x-apply-url":"https://careers.williamsf1.com/job/ai-delivery-manager-in-grove-wantage-jid-514","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"Competitive salary and benefits package","x-skills-required":["Python","R","Machine learning","Statistical modelling","Data visualisation","Tableau","Power BI"],"x-skills-preferred":["Matplotlib","Seaborn","Data governance","Data quality"],"datePosted":"2026-03-07T20:07:48.601Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Grove, Oxfordshire"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Motorsport","skills":"Python, R, Machine learning, Statistical modelling, Data visualisation, Tableau, Power BI, Matplotlib, Seaborn, Data governance, Data quality"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_99247ee9-652"},"title":"Data Scientist","description":"<p>We are seeking a highly skilled Data Scientist to join our team. 
As a Data Scientist, you will play a key role in helping us to make data-driven decisions and drive business growth.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Work closely with the data engineering team to design, develop and maintain data pipelines and architectures</li>\n<li>Collaborate with cross-functional teams to identify business opportunities and develop data-driven solutions</li>\n<li>Develop and maintain machine learning models to drive business growth and improve customer experience</li>\n<li>Analyse large datasets to identify trends and insights that can inform business decisions</li>\n<li>Communicate complex data insights to non-technical stakeholders</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Bachelor&#39;s degree in Computer Science, Mathematics, Statistics or a related field</li>\n<li>2+ years of experience in data science or a related field</li>\n<li>Strong programming skills in languages such as Python, R or SQL</li>\n<li>Experience with machine learning libraries such as scikit-learn or TensorFlow</li>\n<li>Strong analytical and problem-solving skills</li>\n<li>Excellent communication and interpersonal skills</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunity to work with a leading Formula One team</li>\n<li>Collaborative and dynamic work environment</li>\n<li>Professional development and growth opportunities</li>\n<li>Access to cutting-edge technology and tools</li>\n<li>Flexible working hours and remote work options</li>\n</ul>\n<p>If you are a motivated and talented Data Scientist looking for a new challenge, please submit your application. 
We look forward to hearing from you!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_99247ee9-652","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Williams Racing","sameAs":"https://careers.williamsf1.com","logo":"https://logos.yubhub.co/careers.williamsf1.com.png"},"x-apply-url":"https://careers.williamsf1.com/job/simulation-delivery-manager-in-grove-wantage-jid-513","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","R","SQL","scikit-learn","TensorFlow","machine learning","data engineering","data pipelines","data architectures"],"x-skills-preferred":["data visualisation","data storytelling","data communication"],"datePosted":"2026-03-07T20:06:17.714Z","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Motorsport","skills":"Python, R, SQL, scikit-learn, TensorFlow, machine learning, data engineering, data pipelines, data architectures, data visualisation, data storytelling, data communication"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_255c8146-d03"},"title":"Data Scientist","description":"<p>We are seeking a highly skilled Data Scientist to join our team. 
As a Data Scientist, you will play a key role in analysing and interpreting complex data sets to inform our racing strategy and improve our performance on the track.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Work closely with our racing team to understand their needs and develop data-driven solutions to improve their performance</li>\n<li>Develop and maintain complex data models and algorithms to analyse and interpret large data sets</li>\n<li>Collaborate with our data engineering team to design and implement data pipelines and architectures</li>\n<li>Communicate complex technical information to non-technical stakeholders, including our racing team and senior management</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>A strong background in data science, including a degree in a relevant field such as mathematics, statistics, or computer science</li>\n<li>Proficiency in programming languages such as Python, R, or SQL</li>\n<li>Experience with data visualisation tools such as Tableau or Power BI</li>\n<li>Strong analytical and problem-solving skills, with the ability to work independently and as part of a team</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunity to work with a world-class racing team and contribute to their success</li>\n<li>Collaborative and dynamic work environment</li>\n<li>Access to cutting-edge technology and tools</li>\n<li>Professional development opportunities</li>\n</ul>\n<p>Note: The salary range for this role is competitive and will be discussed in more detail during the interview process.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_255c8146-d03","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Williams 
Racing","sameAs":"https://careers.williamsf1.com","logo":"https://logos.yubhub.co/careers.williamsf1.com.png"},"x-apply-url":"https://careers.williamsf1.com/job/indirect-procurement-business-partner-ftc-in-grove-wantage-jid-495","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"Competitive salary and benefits package","x-skills-required":["Python","R","SQL","Tableau","Power BI","Data visualisation","Data analysis","Data modelling","Algorithms"],"x-skills-preferred":["Machine learning","Deep learning","Data engineering","Cloud computing"],"datePosted":"2026-03-07T20:06:17.394Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Grove, Oxfordshire"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Motorsport","skills":"Python, R, SQL, Tableau, Power BI, Data visualisation, Data analysis, Data modelling, Algorithms, Machine learning, Deep learning, Data engineering, Cloud computing"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d308f15c-2ad"},"title":"Data Scientist","description":"<p>We are seeking a highly skilled Data Scientist to join our team. 
As a Data Scientist, you will play a key role in analysing and interpreting complex data sets to inform our racing strategy and improve our performance on the track.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Work closely with our racing team to understand their needs and develop data-driven solutions to improve their performance</li>\n<li>Develop and maintain complex data models and algorithms to analyse and interpret large data sets</li>\n<li>Collaborate with our engineering team to integrate data insights into our racing strategy</li>\n<li>Communicate complex data insights to non-technical stakeholders, including our racing team and senior management</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>A strong background in data science, including a degree in a relevant field such as mathematics, statistics, or computer science</li>\n<li>Proficiency in programming languages such as Python, R, or SQL</li>\n<li>Experience with data visualisation tools such as Tableau or Power BI</li>\n<li>Strong analytical and problem-solving skills, with the ability to interpret complex data sets and develop actionable insights</li>\n<li>Excellent communication and interpersonal skills, with the ability to work effectively with non-technical stakeholders</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunity to work with a highly skilled and experienced team</li>\n<li>Access to cutting-edge technology and resources</li>\n<li>Flexible working hours and remote working options</li>\n<li>Ongoing training and development opportunities</li>\n</ul>\n<p><strong>Salary</strong></p>\n<ul>\n<li>£60,000 - £80,000 per annum, depending on experience</li>\n</ul>\n<p><strong>Category</strong></p>\n<ul>\n<li>Engineering</li>\n</ul>\n<p><strong>Industry</strong></p>\n<ul>\n<li>Motorsport</li>\n</ul>\n<p><strong>Required Skills</strong></p>\n<ul>\n<li>Data science</li>\n<li>Machine learning</li>\n<li>Data 
visualisation</li>\n<li>SQL</li>\n<li>Python</li>\n<li>R</li>\n</ul>\n<p><strong>Preferred Skills</strong></p>\n<ul>\n<li>Tableau</li>\n<li>Power BI</li>\n<li>Data engineering</li>\n<li>Cloud computing</li>\n</ul>\n<p><strong>Experience Level</strong></p>\n<ul>\n<li>Mid</li>\n</ul>\n<p><strong>Employment Type</strong></p>\n<ul>\n<li>Full-time</li>\n</ul>\n<p><strong>Workplace Type</strong></p>\n<ul>\n<li>Hybrid</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d308f15c-2ad","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Williams Racing","sameAs":"https://careers.williamsf1.com","logo":"https://logos.yubhub.co/careers.williamsf1.com.png"},"x-apply-url":"https://careers.williamsf1.com/job/no-1-mechanic-test-team-in-grove-wantage-jid-509","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"£60,000 - £80,000 per annum","x-skills-required":["Data science","Machine learning","Data visualisation","SQL","Python","R"],"x-skills-preferred":["Tableau","Power BI","Data engineering","Cloud computing"],"datePosted":"2026-03-07T20:05:47.344Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Grove, Oxfordshire"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Motorsport","skills":"Data science, Machine learning, Data visualisation, SQL, Python, R, Tableau, Power BI, Data engineering, Cloud 
computing","baseSalary":{"@type":"MonetaryAmount","currency":"GBP","value":{"@type":"QuantitativeValue","minValue":60000,"maxValue":80000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6e6d3e44-db8"},"title":"Data Scientist","description":"<p>We are seeking a highly skilled Data Scientist to join our team. As a Data Scientist, you will play a key role in analysing and interpreting complex data to inform business decisions and drive performance improvement.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Work closely with the data engineering team to design and implement data pipelines and data warehousing solutions</li>\n<li>Develop and maintain data visualisation tools and reports to support business decision-making</li>\n<li>Collaborate with cross-functional teams to identify business opportunities and develop data-driven solutions</li>\n<li>Conduct statistical analysis and machine learning modelling to inform business decisions</li>\n<li>Develop and maintain data quality and governance processes to ensure data accuracy and integrity</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Bachelor&#39;s degree in a quantitative field such as mathematics, statistics, or computer science</li>\n<li>Proven experience in data analysis and machine learning</li>\n<li>Strong programming skills in languages such as Python or R</li>\n<li>Experience with data visualisation tools such as Tableau or Power BI</li>\n<li>Excellent communication and interpersonal skills</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunity to work with a leading Formula One team</li>\n<li>Collaborative and dynamic work environment</li>\n<li>Professional development and training opportunities</li>\n<li>Access to state-of-the-art technology and tools</li>\n<li>Flexible working hours and remote working options</li>\n<li>Annual 
bonus scheme</li>\n<li>25 days&#39; annual leave</li>\n<li>Pension scheme</li>\n<li>Free on-site parking and meals</li>\n<li>Access to on-site gym and fitness classes</li>\n<li>Discounts on team merchandise and hospitality events</li>\n</ul>\n<p><strong>Preferred Qualifications</strong></p>\n<ul>\n<li>Master&#39;s degree in a quantitative field</li>\n<li>Experience with cloud-based data platforms such as AWS or GCP</li>\n<li>Experience with big data technologies such as Hadoop or Spark</li>\n<li>Certification in data science or machine learning</li>\n<li>Experience with data governance and quality processes</li>\n</ul>\n<p>If you are a motivated and talented Data Scientist looking for a new challenge, please submit your application. We look forward to hearing from you.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_6e6d3e44-db8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Williams Racing","sameAs":"https://careers.williamsf1.com","logo":"https://logos.yubhub.co/careers.williamsf1.com.png"},"x-apply-url":"https://careers.williamsf1.com/job/test-and-validation-senior-test-engineer-in-grove-wantage-jid-395","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"Competitive salary and benefits package","x-skills-required":["Python","R","Tableau","Power BI","AWS","GCP","Hadoop","Spark","Data visualisation","Machine learning","Data governance","Data quality"],"x-skills-preferred":["Cloud-based data platforms","Big data technologies","Certification in data science or machine learning"],"datePosted":"2026-03-07T20:04:31.518Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Grove, Oxfordshire"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Motorsport","skills":"Python, R, Tableau, Power BI, AWS, GCP, Hadoop, 
Spark, Data visualisation, Machine learning, Data governance, Data quality, Cloud-based data platforms, Big data technologies, Certification in data science or machine learning"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_27800f00-a0b"},"title":"Data Scientist","description":"<p>We are seeking a highly skilled Data Scientist to join our team. As a Data Scientist, you will play a key role in helping us to make data-driven decisions and to drive innovation in our organisation.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Work closely with our engineering and racing teams to collect, analyse and interpret large datasets</li>\n<li>Develop and implement machine learning models to improve our understanding of racing performance and to identify areas for improvement</li>\n<li>Collaborate with our data engineers to design and implement data pipelines and to ensure the quality and integrity of our data</li>\n<li>Communicate complex technical information to non-technical stakeholders and to provide recommendations for business improvement</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>A degree in a quantitative subject such as mathematics, statistics or computer science</li>\n<li>Proven experience in data science and machine learning</li>\n<li>Strong programming skills in languages such as Python or R</li>\n<li>Experience with data visualisation tools such as Tableau or Power BI</li>\n<li>Excellent communication and interpersonal skills</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunity to work with a leading Formula One team</li>\n<li>Collaborative and dynamic working environment</li>\n<li>Professional development and training opportunities</li>\n<li>Access to state-of-the-art technology and equipment</li>\n<li>Flexible working hours and remote working options</li>\n<li>Annual bonus scheme</li>\n<li>25 
days&#39; annual leave</li>\n<li>Access to on-site gym and other employee benefits</li>\n</ul>\n<p>If you are a motivated and talented individual who is passionate about data science and Formula One, we would love to hear from you.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_27800f00-a0b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Williams Racing","sameAs":"https://careers.williamsf1.com","logo":"https://logos.yubhub.co/careers.williamsf1.com.png"},"x-apply-url":"https://careers.williamsf1.com/job/test-and-validation-engineering-technician-in-grove-wantage-jid-374","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"Competitive salary and benefits package","x-skills-required":["Python","R","Machine learning","Data visualisation","Data engineering"],"x-skills-preferred":["Tableau","Power BI","SQL"],"datePosted":"2026-03-07T20:04:16.819Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Grove, Oxfordshire"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Motorsport","skills":"Python, R, Machine learning, Data visualisation, Data engineering, Tableau, Power BI, SQL"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_93c50f21-80e"},"title":"Strategic Risk Analyst","description":"<p><strong>Job Posting</strong></p>\n<p><strong>Strategic Risk Analyst</strong></p>\n<p><strong>Location</strong></p>\n<p>San Francisco</p>\n<p><strong>Employment Type</strong></p>\n<p>Full time</p>\n<p><strong>Location Type</strong></p>\n<p>Hybrid</p>\n<p><strong>Department</strong></p>\n<p>Intelligence &amp; Investigations</p>\n<p><strong>Compensation</strong></p>\n<ul>\n<li>$198K – $320K • Offers Equity</li>\n</ul>\n<p>The base pay offered may vary 
depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>\n<ul>\n<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>\n</ul>\n<ul>\n<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>\n</ul>\n<ul>\n<li>401(k) retirement plan with employer match</li>\n</ul>\n<ul>\n<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>\n</ul>\n<ul>\n<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>\n</ul>\n<ul>\n<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>\n</ul>\n<ul>\n<li>Mental health and wellness support</li>\n</ul>\n<ul>\n<li>Employer-paid basic life and disability coverage</li>\n</ul>\n<ul>\n<li>Annual learning and development stipend to fuel your professional growth</li>\n</ul>\n<ul>\n<li>Daily meals in our offices, and meal delivery credits as eligible</li>\n</ul>\n<ul>\n<li>Relocation support for eligible employees</li>\n</ul>\n<ul>\n<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>\n</ul>\n<p>More details about our benefits are available to candidates during the hiring process.</p>\n<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components 
at any time based on individual performance, team or company results, or market conditions.</p>\n<p><strong>About the team</strong></p>\n<p>The Intelligence and Investigations team seeks to rapidly identify and mitigate abuse and strategic risks to ensure a safe online ecosystem. We are dedicated to identifying emerging abuse trends, analysing risks, and working with our internal and external partners to implement effective mitigation strategies to protect against misuse. Our efforts contribute to OpenAI&#39;s overarching goal of developing AI that benefits humanity.</p>\n<p>We are building a horizontal “radar” for AI abuse and strategic risk—correlating internal signals, external intelligence, and real-world events into clear, actionable priorities for OpenAI’s safety and product decision-makers.</p>\n<p><strong>About the role</strong></p>\n<p>As a Strategic Risk Analyst, you will help develop and maintain our central view of strategic risk across OpenAI’s products and platforms. You will synthesise internal abuse patterns, upstream and external intelligence, and product and conversational signals into decision-ready risk insights, recurring briefs, and practical prioritisation inputs</p>\n<p>You will partner closely with investigators, engineers, and policy and trust and safety counterparts, as well as measurement and forecasting teammates, to translate messy signals into structured judgments (including assumptions and confidence), ranked priorities, and actionable recommendations. 
This is an opportunity to do high-leverage analysis in a fast-moving environment, where crisp thinking and communication directly shape safety decisions, mitigations, and product readiness.</p>\n<p><strong>In this role, you will</strong></p>\n<ul>\n<li>Monitor and analyse internal risk signals (abuse telemetry, investigations outputs, model and product signals) to identify trends, shifts in tactics, and new abuse patterns.</li>\n</ul>\n<ul>\n<li>Conduct upstream and external scanning (OSINT, ecosystem developments, real-world events) and distil implications for OpenAI’s products and threat landscape.</li>\n</ul>\n<ul>\n<li>Identify and deep dive into harms and misuse across products and channels, turning messy signals into clear analytic findings.</li>\n</ul>\n<ul>\n<li>Connect individual incidents into system-level narratives about actors, incentives, product design weaknesses, and cross-product spillover—pressure-testing hypotheses early.</li>\n</ul>\n<ul>\n<li>Produce concise, decision-ready risk briefs and intelligence estimates with explicit assumptions, confidence levels, and what would change the assessment.</li>\n</ul>\n<ul>\n<li>Convert analysis into clear, ranked priorities and actionable recommendations that product, safety, and policy teams can execute on.</li>\n</ul>\n<ul>\n<li>Define and track key risk indicators and outcome metrics to evaluate whether mitigations are working and drive course corrections when needed.</li>\n</ul>\n<ul>\n<li>Build early-warning and monitoring capabilities with data, engineering, and visualisation partners, including dashboards that highlight leading indicators and unusual changes.</li>\n</ul>\n<ul>\n<li>Contribute to product readiness and launch reviews; develop reusable playbooks, FAQs, and briefing materials that help teams respond consistently.</li>\n</ul>\n<ul>\n<li>Drive cross-functional alignment by tailoring readouts to investigations, engineering, policy, trust and safety, and product stakeholders—and ensuring 
decisions and follow-ups are crisp.</li>\n</ul>\n<p><strong>You might thrive in this role if you</strong></p>\n<ul>\n<li>Have significant experience (typically <strong>5+ years</strong>) in trust and safety, integrity, security, policy analysis, or intelligence work.</li>\n</ul>\n<ul>\n<li>Have a demonstrated ability to analyse complex online harms and AI-enabled misuse (e.g., harassment, coordinated abuse, scams, synthetic media, influence operations, brand safety issues) and convert analysis into concrete, prioritised recommendations.</li>\n</ul>\n<ul>\n<li>Have strong analytical craft: you can identify weak signals, form hypotheses, test them quickly, state assumptions explicitly, and communicate confidence and uncertainty clearly.</li>\n</ul>\n<ul>\n<li>Are comfortable working across qualitative and quantitative inputs, including (1) casework,</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_93c50f21-80e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"OpenAI","sameAs":"https://jobs.ashbyhq.com","logo":"https://logos.yubhub.co/openai.com.png"},"x-apply-url":"https://jobs.ashbyhq.com/openai/d821a725-671f-4327-b918-9be90ef7be45","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$198K – $320K • Offers Equity","x-skills-required":["trust and safety","integrity","security","policy analysis","intelligence work","online harms","AI-enabled misuse","harassment","coordinated abuse","scams","synthetic media","influence operations","brand safety issues"],"x-skills-preferred":["data analysis","data visualisation","machine learning","natural language processing","software development"],"datePosted":"2026-03-06T18:42:41.351Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San 
Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"trust and safety, integrity, security, policy analysis, intelligence work, online harms, AI-enabled misuse, harassment, coordinated abuse, scams, synthetic media, influence operations, brand safety issues, data analysis, data visualisation, machine learning, natural language processing, software development","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":198000,"maxValue":320000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_92fce934-b53"},"title":"Data Scientist","description":"<p>We are seeking a highly skilled Data Scientist to join our team. As a Data Scientist, you will be responsible for analysing large datasets to gain insights and inform our racing strategy.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Develop and implement data analysis and machine learning models to improve our racing performance</li>\n<li>Work closely with our engineering and racing teams to understand their needs and develop solutions</li>\n<li>Collaborate with our data engineers to design and implement data pipelines and architectures</li>\n<li>Develop and maintain data visualisation tools to communicate insights to our teams</li>\n<li>Stay up-to-date with the latest developments in data science and machine learning</li>\n<li>Work with our data engineers to ensure data quality and integrity</li>\n<li>Develop and maintain data documentation and standards</li>\n<li>Collaborate with our racing teams to develop and implement data-driven strategies</li>\n<li>Work with our data engineers to develop and implement data-driven decision-making tools</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Bachelor&#39;s degree in Computer Science, Mathematics, Statistics, or a related field</li>\n<li>2+ years of experience in data science or a related field</li>\n<li>Strong programming skills in Python, R, or SQL</li>\n<li>Experience with machine learning libraries such as scikit-learn, TensorFlow, or PyTorch</li>\n<li>Experience with data visualisation tools such as Matplotlib, Seaborn, or Plotly</li>\n<li>Strong understanding of statistical concepts and techniques</li>\n<li>Experience with data engineering and data architecture</li>\n<li>Strong communication and collaboration skills</li>\n<li>Ability to work in a fast-paced environment</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunity to work with a professional motorsport organisation</li>\n<li>Collaborative and dynamic work environment</li>\n<li>Opportunities for professional growth and development</li>\n<li>Access to cutting-edge technology and tools</li>\n<li>Flexible working hours and remote work options</li>\n</ul>\n<p><strong>How to Apply</strong></p>\n<p>If you are a motivated and talented Data Scientist looking for a new challenge, please submit your application, including your resume and a cover letter, to [insert contact information]. We look forward to hearing from you!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_92fce934-b53","directApply":true,"hiringOrganization":{"@type":"Organization","name":"W Racing Team","sameAs":"https://www.w-racingteam.com","logo":"https://logos.yubhub.co/w-racingteam.com.png"},"x-apply-url":"https://www.w-racingteam.com/manufacturing/careers/mécano","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"Competitive salary and benefits package","x-skills-required":["Python","R","SQL","Machine learning","Data visualisation","Statistical concepts","Data engineering","Data architecture"],"x-skills-preferred":["TensorFlow","PyTorch","Matplotlib","Seaborn","Plotly"],"datePosted":"2026-03-06T14:29:05.633Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Monza"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Motorsport","skills":"Python, R, SQL, Machine learning, Data visualisation, Statistical concepts, Data engineering, Data architecture, TensorFlow, PyTorch, Matplotlib, Seaborn, Plotly"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d855dae9-6d1"},"title":"Head of Analytical Leads","description":"<p><strong>Summary</strong></p>\n<p>Microsoft AI are looking for a talented Head of Analytical Leads at their Sydney office. 
This role sits at the heart of strategic decision-making, turning market data into actionable insights for a company that&#39;s revolutionising digital advertising technology. You&#39;ll work directly with leadership to shape the company&#39;s direction in the digital advertising market.</p>\n<p><strong>About the Role</strong></p>\n<p>As a Head of Analytical Leads, you will manage a team of senior analytical leaders across APAC, delivering insights and recommendations that improve client performance and unlock new investment opportunities on the Microsoft Advertising platform. You will combine industry knowledge, digital marketing expertise, and advanced analytics to craft compelling, data-driven narratives that influence senior decision-makers through clear storytelling and visualisation.</p>\n<p><strong>Accountabilities</strong></p>\n<ul>\n<li>Conduct in-depth market research across digital advertising sectors, identifying emerging trends, competitive threats, and partnership opportunities that directly inform the company&#39;s quarterly strategic planning sessions</li>\n<li>Ensure team members evaluate the sufficiency of data between internal and external sources, drive accountability for identifying and resolving data quality issues, and encourage and coordinate cross-team collaboration and data sharing to build data pipelines or integrations</li>\n</ul>\n<p><strong>The Candidate we&#39;re looking for</strong></p>\n<p><strong>Experience:</strong></p>\n<ul>\n<li>5+ years of experience in a leadership role, preferably in a digital advertising or analytics context</li>\n</ul>\n<p><strong>Technical skills:</strong></p>\n<ul>\n<li>Advanced analytics skills, including data visualisation and storytelling</li>\n<li>Strong understanding of digital marketing principles and practices</li>\n</ul>\n<p><strong>Personal attributes:</strong></p>\n<ul>\n<li>Proactive and challenger mindset</li>\n<li>Strong people leadership and coaching 
skills</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salary and bonus structure</li>\n<li>Comprehensive health and wellbeing benefits</li>\n<li>Professional development opportunities</li>\n<li>Flexible work arrangements, including remote and hybrid options</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d855dae9-6d1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Microsoft AI","sameAs":"https://microsoft.ai","logo":"https://logos.yubhub.co/microsoft.ai.png"},"x-apply-url":"https://microsoft.ai/job/head-of-analytical-leads-sydney/","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"Competitive salary and bonus structure","x-skills-required":["advanced analytics","data visualisation","digital marketing","leadership","coaching"],"x-skills-preferred":["project management","communication","problem-solving"],"datePosted":"2026-03-06T07:25:08.188Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Sydney"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"advanced analytics, data visualisation, digital marketing, leadership, coaching, project management, communication, problem-solving"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4729a3bc-4d7"},"title":"Product Data Scientist, Search Quality","description":"<p>We are looking for a Product Data Scientist to join our team. 
As a Product Data Scientist, you will be responsible for developing data-driven insights from user behaviour to inform our product roadmap and accelerate adoption.</p>\n<p><strong>What you&#39;ll do</strong></p>\n<p>As a Product Data Scientist, you will be responsible for the following tasks:</p>\n<ul>\n<li>Developing data-driven insights from user behaviour to inform our product roadmap and accelerate adoption</li>\n<li>Formulating hypotheses and validating them by designing, running, and analysing A/B tests</li>\n<li>Determining appropriate metrics and visualisations for tracking, and implementing them in dashboards</li>\n<li>Designing new pipelines that will help to deliver better ranking quality</li>\n</ul>\n<p><strong>What you need</strong></p>\n<p>To be successful in this role, you will need the following skills:</p>\n<ul>\n<li>4+ years of experience working as a data analyst or in a related role</li>\n<li>Experience working on search-related products, with emphasis on designing online metrics and analysing A/B experiments</li>\n<li>Strong Python skills (expected to write production-grade code)</li>\n<li>Proficiency with SQL</li>\n<li>Experience with Business Intelligence (BI) tools</li>\n</ul>\n<p><strong>Why this matters</strong></p>\n<p>As a Product Data Scientist, you will have the opportunity to work on complex problems and develop innovative solutions. 
You will be part of a team of experts who are passionate about creating innovative solutions to complex problems.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_4729a3bc-4d7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Perplexity","sameAs":"https://jobs.ashbyhq.com","logo":"https://logos.yubhub.co/perplexity.com.png"},"x-apply-url":"https://jobs.ashbyhq.com/perplexity/a805e14b-061d-469c-9136-b9e6a1855902","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data analysis","A/B testing","data visualisation","Python","SQL","Business Intelligence"],"x-skills-preferred":["Apache Spark","Databricks","LLM-as-a-judge systems"],"datePosted":"2026-03-04T12:27:21.483Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Belgrade, Berlin, London"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data analysis, A/B testing, data visualisation, Python, SQL, Business Intelligence, Apache Spark, Databricks, LLM-as-a-judge systems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_009e88c6-235"},"title":"Data Scientist","description":"<p>We are seeking a highly skilled data scientist to join our team. 
As a data scientist, you will be responsible for collecting, analysing and interpreting complex data to inform business decisions and drive performance improvement.</p>\n<p><strong>What you&#39;ll do</strong></p>\n<ul>\n<li>Develop and implement data-driven solutions to drive business performance</li>\n<li>Collaborate with cross-functional teams to identify and prioritise data-driven opportunities</li>\n<li>Design and implement data visualisation tools to communicate insights to stakeholders</li>\n</ul>\n<p><strong>What you need</strong></p>\n<ul>\n<li>Strong analytical and problem-solving skills</li>\n<li>Experience with data visualisation tools such as Tableau or Power BI</li>\n<li>Proficiency in programming languages such as Python or R</li>\n</ul>\n<p><strong>Why this matters</strong></p>\n<p>As a data scientist at Williams F1 Team, you will have the opportunity to work on high-profile projects and contribute to the team&#39;s success. You will be part of a dynamic and innovative team that is passionate about using data to drive performance improvement.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_009e88c6-235","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Williams F1 Team","sameAs":"https://careers.williamsf1.com","logo":"https://logos.yubhub.co/careers.williamsf1.com.png"},"x-apply-url":"https://careers.williamsf1.com/job/hr-administrator-coordinator-in-grove-wantage-jid-436","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"Competitive salary and benefits package","x-skills-required":["data analysis","data visualisation","programming"],"x-skills-preferred":["machine 
learning","statistics"],"datePosted":"2026-02-20T00:03:11.727Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Oxfordshire"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Motorsport","skills":"data analysis, data visualisation, programming, machine learning, statistics"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_24e8cd49-bf1"},"title":"Playtest Analyst, Battlefield QV","description":"<p>We are looking for a Playtest Analyst who is passionate about data and data visualisation. Our mission is building a better game through objective analysis, and you will be challenged with unearthing and using quantitative data to inform insights for the product team.</p>\n<p><strong>What you&#39;ll do</strong></p>\n<p>You will partner with other data specialists to form the data strategy for the playtesting team. You will collaborate with designers &amp; development partners to identify important questions and the data to answer them. You will gain a deep understanding of Battlefield game telemetry. You will create data sets and analyse them using exploratory techniques to identify insights. You will produce compelling visualisations that support assessments of quality. 
You will communicate insights to designers, product managers, and others.</p>\n<p><strong>What you need</strong></p>\n<ul>\n<li>Bachelor&#39;s degree or equivalent professional experience.</li>\n<li>At least 3 years of relevant experience.</li>\n<li>Experience with a variety of data visualisation tools.</li>\n<li>Experience with SQL and Python.</li>\n<li>Experience with game telemetry or similar.</li>\n<li>A player of first person shooter games.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_24e8cd49-bf1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Electronic Arts","sameAs":"https://jobs.ea.com","logo":"https://logos.yubhub.co/jobs.ea.com.png"},"x-apply-url":"https://jobs.ea.com/en_US/careers/JobDetail/Playtest-Analyst-Battlefield-QV/211841","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data visualisation","SQL","Python","game telemetry"],"x-skills-preferred":["data analysis","data science","data engineering"],"datePosted":"2026-01-23T06:06:16.329Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Guildford"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data visualisation, SQL, Python, game telemetry, data analysis, data science, data engineering"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1566c412-a9e"},"title":"Junior Data Analyst","description":"<p>We are seeking a highly motivated and detail-oriented Junior Data Analyst to join our team. 
As a Junior Data Analyst, you will be responsible for collecting, analysing and interpreting data to help us make informed decisions.</p>\n<p><strong>What you&#39;ll do</strong></p>\n<ul>\n<li>Collect and analyse data from various sources, including sensors, GPS and telemetry systems</li>\n<li>Develop and maintain databases and data visualisation tools to help us understand and interpret the data</li>\n<li>Work closely with the engineering and racing teams to identify areas for improvement and develop strategies to address them</li>\n</ul>\n<p><strong>What you need</strong></p>\n<ul>\n<li>Strong analytical and problem-solving skills</li>\n<li>Proficiency in data analysis software, including Excel, SQL and data visualisation tools</li>\n<li>Experience with programming languages, including Python and R</li>\n<li>Strong communication and interpersonal skills</li>\n</ul>\n<p><strong>Why this matters</strong></p>\n<p>As a Junior Data Analyst at M-Sport, you will play a critical role in helping us to improve our performance and achieve our goals. You will have the opportunity to work with a talented team of engineers and racing drivers, and to contribute to the development of new technologies and strategies. 
This is an exciting opportunity for anyone who is passionate about motorsport and data analysis, and who is looking to start their career in a dynamic and fast-paced industry.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_1566c412-a9e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"M-Sport","sameAs":"https://www.m-sport.co.uk","logo":"https://logos.yubhub.co/m-sport.co.uk.png"},"x-apply-url":"https://www.m-sport.co.uk/single-post/cards-on-the-table-for-season-opener","x-work-arrangement":"onsite","x-experience-level":"entry","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data analysis","data visualisation","programming","communication"],"x-skills-preferred":["SQL","Python","R"],"datePosted":"2025-12-20T09:15:33.646Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"M-Sport Village, Dovenby, Cockermouth, Cumbria, CA13 0PB"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Motorsport","skills":"data analysis, data visualisation, programming, communication, SQL, Python, R"}]}