{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/analytics-engineering"},"x-facet":{"type":"skill","slug":"analytics-engineering","display":"Analytics Engineering","count":13},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a1f35a9c-2e5"},"title":"Staff Data Scientist","description":"<p>We are looking for a Staff Product Data Scientist to join our Data &amp; Insights team. This role will lead the strategy, development, and operationalization of advanced analytics and machine learning solutions that power data intelligence for Okta’s Product Management teams, uncovering actionable insights on customer behavior, product engagement, and opportunities that will drive our product growth strategy.</p>\n<p>You’ll define the long-term data science roadmap while partnering closely with Okta’s diverse Product Management teams and executive leadership.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Collaborate closely with Product Managers and leadership to provide insights that will help drive product strategy and roadmaps, ensuring decisions are grounded in a data-driven approach.</li>\n<li>Build and deploy statistical and machine learning models (e.g., predicting churn, lifetime value, feature adoption) to forecast user behavior and product growth opportunities in order to help influence roadmap decisions.</li>\n<li>Conduct thorough exploratory analysis on extensive, 
complex datasets to uncover key drivers of user adoption and engagement, identifying unseen opportunities for significant product improvement.</li>\n<li>Work with data engineering, analytics engineering, and data analysts to shape and enhance the foundational data infrastructure necessary for scalable ML and advanced analytics initiatives.</li>\n<li>Develop and construct usable data sets by integrating and manipulating information from various disparate data sources as needed.</li>\n<li>Translate intricate data insights into clear, compelling narratives for executives, product managers, and engineers, effectively influencing crucial business and product decisions.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>7+ years of experience in data science or ML, including 3–5 years in a senior, staff, or technical leadership role.</li>\n<li>Deep understanding of product analytics and data science frameworks, with a proven history of designing and analyzing product data to solve complex, ambiguous business problems and deliver measurable results.</li>\n<li>Demonstrated ability to apply cutting-edge AI tools to accelerate the discovery of deep, actionable insights from complex product data.</li>\n<li>Demonstrated ability to translate data insights into product impact and formulate strategic, data-driven recommendations.</li>\n<li>Deep expertise in machine learning algorithms (supervised, unsupervised, NLP, forecasting, optimization) and statistical modeling.</li>\n<li>Strong proficiency in Python, SQL, and leading ML libraries.</li>\n<li>Excellent communication skills and a proven ability to influence Product and executive partners.</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Experience in high-growth SaaS, cybersecurity, identity, or enterprise software environments.</li>\n<li>Prior ownership of ML platform/tooling decisions and evaluations.</li>\n<li>Experience enabling self-service analytics or citizen data science 
capabilities.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_a1f35a9c-2e5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Okta","sameAs":"https://www.okta.com/","logo":"https://logos.yubhub.co/okta.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/okta/jobs/7731595","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$184,000-$253,000 USD","x-skills-required":["data science","machine learning","Python","SQL","statistical modeling","data engineering","analytics engineering","data analysts"],"x-skills-preferred":["high-growth SaaS","cybersecurity","identity","enterprise software"],"datePosted":"2026-04-18T15:58:37.181Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bellevue, Washington; San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"data science, machine learning, Python, SQL, statistical modeling, data engineering, analytics engineering, data analysts, high-growth SaaS, cybersecurity, identity, enterprise software","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":184000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_43334479-97e"},"title":"Sr Analytics Engineer - GTM Strategy and Operations","description":"<p>As a Senior Analytics Engineer, you will be a critical partner to the Global GTM Strategy &amp; Operations teams, providing the data, AI-driven insights, and infrastructure needed to drive efficiency and effectiveness across the organization.</p>\n<p>You will design, build, and maintain scalable data models, curated reporting tables, forecasts, and dashboards that 
support everyone from senior executives to individual contributors, empowering them to make informed decisions and spend more time driving customer outcomes.</p>\n<p>Working closely with cross-functional stakeholders, including Sales, Finance, Marketing, and other data teams, you will tackle complex data challenges by leveraging structured data, building AI-powered querying assistants, and using tools like Databricks Genie to improve data accessibility, streamline insights, and deliver actionable, reliable solutions across the business.</p>\n<p>You will also play a key role in advancing our newly created AI initiatives and semantic data curation efforts, helping to establish a strong foundation for advanced analytics, automation, and scalable business intelligence.</p>\n<p>The Impact You Will Have:</p>\n<ul>\n<li>Build: You will design and develop analytic tools, including a semantic layer for AI use cases, scalable data models, curated tables, and insightful analyses that empower thousands of field employees and leaders worldwide.</li>\n</ul>\n<ul>\n<li>Architect: You will both manage requirements gathering and lead execution of strategic analytic projects.</li>\n</ul>\n<ul>\n<li>Scale: You will build and manage relationships with stakeholders across the company, but primarily with the GTM strategy and operations team.</li>\n</ul>\n<p>What we look for:</p>\n<ul>\n<li>You have 4+ years of experience working as an Analyst / Data Engineer / Analytics Engineer with B2B sales, marketing, or finance data (GTM experience highly preferred).</li>\n</ul>\n<ul>\n<li>You are data-savvy with 3+ years of SQL and 2+ years of Python experience. Familiarity with data ecosystems and BI tools (e.g., Databricks, PowerBI) is required.</li>\n</ul>\n<ul>\n<li>You have built for scale. You have experience building scalable and productionizable data models with best practices in mind.</li>\n</ul>\n<ul>\n<li>You integrate AI into your daily workflow. 
You have hands-on experience using large language model tools (such as Claude or similar) to accelerate analytics work, from drafting and debugging code to synthesizing requirements and generating documentation.</li>\n</ul>\n<ul>\n<li>You&#39;re comfortable evaluating AI-generated outputs critically and iterating quickly.</li>\n</ul>\n<ul>\n<li>You are passionate about applying AI to transform GTM teams. You bring experience in delivering AI-driven solutions and have the ability to design innovative use cases as well as structure data models and tables that are optimized for AI readiness.</li>\n</ul>\n<ul>\n<li>You excel in partnering with the business, understanding the impact of your work on GTM, and creating innovative solutions.</li>\n</ul>\n<ul>\n<li>You have a track record of cross-functional collaboration and strong stakeholder relationships.</li>\n</ul>\n<ul>\n<li>You excel in a collaborative environment. You translate team member needs into clear tasks and deliverables for contributors.</li>\n</ul>\n<ul>\n<li>You work through dependencies, bottlenecks, and tradeoffs with ease.</li>\n</ul>\n<ul>\n<li>You have a service-oriented mindset.</li>\n</ul>\n<ul>\n<li>You are curious, creative, and kind.</li>\n</ul>\n<p>Pay Range Transparency:</p>\n<p>Databricks is committed to fair and equitable compensation practices. 
The pay range(s) for this role is listed below and represents the expected salary range for non-commissionable roles or on-target earnings for commissionable roles.</p>\n<p>Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location.</p>\n<p>Based on the factors above, Databricks anticipates utilizing the full width of the range.</p>\n<p>The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.</p>\n<p>For more information regarding which range your location is in visit our page here.</p>\n<p>Zone 1 Pay Range $133,000-$182,950 USD</p>","url":"https://yubhub.co/jobs/job_43334479-97e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Databricks","sameAs":"https://databricks.com","logo":"https://logos.yubhub.co/databricks.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/databricks/jobs/8479036002","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$133,000-$182,950 USD","x-skills-required":["SQL","Python","Databricks","PowerBI","Data Engineering","Analytics Engineering","AI","Machine Learning"],"x-skills-preferred":["Large Language Model Tools","Claude","Semantic Data Curation","Advanced Analytics","Automation","Scalable Business Intelligence"],"datePosted":"2026-04-18T15:58:18.439Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York; San Francisco, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, Databricks, PowerBI, Data Engineering, Analytics Engineering, AI, Machine Learning, Large 
Language Model Tools, Claude, Semantic Data Curation, Advanced Analytics, Automation, Scalable Business Intelligence","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":133000,"maxValue":182950,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_84875ccd-0a5"},"title":"Senior Partner Marketing Manager","description":"<p>As a Senior Partner Marketing Manager, you&#39;ll play a critical role in shaping and executing co-marketing initiatives with some of our most important technology partners globally.</p>\n<p>Your work will amplify the reach and impact of dbt across the modern data stack, helping to elevate our brand and drive growth through the ecosystem.</p>\n<p>This is a highly cross-functional role where strategic thinking, creativity, and strong collaboration skills will be key to success.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Lead the creation and execution of marketing campaigns, programs, events, and activities with strategic technology partners.</li>\n<li>Collaborate closely with the Revenue Marketing, Partnerships, and Product Marketing teams to ensure GTM partner plans align with dbt Labs&#39; broader business goals.</li>\n<li>Build and nurture relationships with marketing counterparts at key partners like Snowflake, Google, AWS, Microsoft, and Databricks to align on co-marketing efforts and shared objectives.</li>\n<li>Clearly articulate the value of dbt to partners and support them in promoting the platform internally and to their customer base.</li>\n<li>Own the development of joint messaging and co-branded assets, including blogs, webinars, solution briefs, and presentation decks, ensuring alignment and consistency across all public-facing content.</li>\n<li>Create internal enablement materials to educate and empower sales teams to leverage partner campaigns and initiatives.</li>\n<li>Gain a 
deep understanding of partner business strategies and priorities; design co-marketing programs that provide mutual value.</li>\n<li>Set and manage OKRs, track program performance, and deliver quarterly reviews with partners to assess impact, identify opportunities, and ensure strategic alignment.</li>\n<li>Develop annual GTM marketing plans tailored to individual partners, accounting for geographic and vertical-specific nuances.</li>\n<li>Exercise strategic judgment in deciding which partner activities to pursue and how best to allocate time and resources for maximum impact.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>8+ years of experience in B2B marketing or similar, particularly within the data or software industry, with a strong track record of building and executing successful partner marketing programs.</li>\n<li>A &#39;builder&#39; mindset: you enjoy solving problems, creating structure where none exists, and working cross-functionally to drive measurable outcomes.</li>\n<li>Deep familiarity with the modern data ecosystem and players such as Snowflake, Google, AWS, Microsoft, and Databricks.</li>\n<li>The ability to navigate complex partner organizations and manage relationships with multiple stakeholders across competing interests.</li>\n<li>Strong storytelling and positioning skills: you know how to distill joint value propositions into compelling messaging and content.</li>\n<li>Comfort operating in a fast-paced, dynamic environment with high levels of ambiguity.</li>\n<li>A broad understanding of integrated marketing strategies, including digital campaigns, field marketing, and industry events.</li>\n<li>Exceptional communication skills, including concise writing and confident presentation abilities, especially with senior stakeholders.</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Experience working asynchronously within a remote, distributed team.</li>\n<li>Prior experience working for or closely with any of dbt Labs&#39; 
strategic partners.</li>\n<li>Familiarity with the role dbt Labs plays in the cloud data warehouse ecosystem and the modern data stack.</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Unlimited vacation time with a culture that actively encourages time off</li>\n<li>401k plan with 3% guaranteed company contribution</li>\n<li>Comprehensive healthcare coverage</li>\n<li>Generous paid parental leave</li>\n<li>Flexible stipends for:<ul>\n<li>Health &amp; Wellness</li>\n<li>Home Office Setup</li>\n<li>Cell Phone &amp; Internet</li>\n<li>Learning &amp; Development</li>\n<li>Office Space</li>\n</ul></li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>We offer competitive compensation packages commensurate with experience, including salary, RSUs, and where applicable, performance-based pay. Our Talent Acquisition Team can answer questions around dbt Labs&#39; total rewards during your interview process.</p>\n<p>In select locations (including Austin, Boston, Chicago, Denver, Los Angeles, Philadelphia, New York City, San Francisco, Washington, DC, and Seattle), an alternate range may apply, as specified below.</p>\n<ul>\n<li>The typical starting salary range for this role is: $132,000-$188,700</li>\n<li>The typical starting salary range for this role in the select locations listed is: $147,000-$209,000</li>\n</ul>","url":"https://yubhub.co/jobs/job_84875ccd-0a5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673163005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$132,000-$188,700","x-skills-required":["B2B marketing","Partner marketing","Digital campaigns","Field marketing","Industry events","Cloud data 
warehouse ecosystem","Modern data stack","Data engineering","Analytics engineering","Snowflake","Google","AWS","Microsoft","Databricks"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:55:34.081Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Marketing","industry":"Technology","skills":"B2B marketing, Partner marketing, Digital campaigns, Field marketing, Industry events, Cloud data warehouse ecosystem, Modern data stack, Data engineering, Analytics engineering, Snowflake, Google, AWS, Microsoft, Databricks","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":132000,"maxValue":188700,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9ae9db20-be4"},"title":"Technical Instructor","description":"<p>We&#39;re seeking a Technical Instructor with a passion for teaching and working with data to join our training team to develop curriculum and deliver instruction focused on dbt. As a Technical Instructor, you will deliver live, world-class instruction to train and onboard dbt Cloud customers, partners, and GSIs in small groups, large groups, and webinar audiences. You will create an engaging learning environment initially in a remote context and likely in person in the future. You will get learners excited about using dbt Cloud to make an impact at their organization. You will clearly teach and demo new concepts and skills for learners. You will facilitate live co-development sessions where learners apply what they have learned. You will adjust instruction on the fly while focusing on learner outcomes. You will provide critical feedback from your classroom experience to improve curriculum changes. You will become a product expert with dbt in the context of the modern data stack. 
You will build curriculum independently. You will gather and implement feedback and self-review your teaching.</p>\n<p>To be successful in this role, you will need a Bachelor&#39;s degree in a related field such as Computer Science, Data Analytics, Education, or similar. You will also need 2-4 years of technical instruction or related experience. You will love teaching and creating those lightbulb moments for learners. You will create learning environments with high levels of engagement. You will be laser-focused on learner and customer outcomes while adjusting instruction on the fly. You will believe teaching is a craft that we can always get better at and actively seek out feedback. You will communicate clearly and concisely with internal and external stakeholders. You will thrive in an environment of cross-collaboration that moves quickly. You will have experience developing curricula and shipping courses fast.</p>\n<p>What will make you stand out? You have worked on customer education/training teams and know how training can drive outcomes for customers. You have experience using dbt and/or teaching dbt. You have experience writing analytics code (e.g., Python, R) in addition to SQL and working with databases. You have experience designing curricula with a focus on backwards design. 
You have a dbt Fundamentals badge.</p>","url":"https://yubhub.co/jobs/job_9ae9db20-be4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4667068005","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$73,000 - $88,200 USD","x-skills-required":["dbt","data analytics","education","curriculum development","instructional design","teaching","learning and development"],"x-skills-preferred":["product expertise","customer education","training","analytics engineering","data science"],"datePosted":"2026-04-18T15:55:13.917Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US East - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"dbt, data analytics, education, curriculum development, instructional design, teaching, learning and development, product expertise, customer education, training, analytics engineering, data science","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":73000,"maxValue":88200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d0db29d2-16c"},"title":"Senior Data Quality Analyst","description":"<p>We are seeking a Senior Data Quality Analyst to join our Data Products team. As a Senior Data Quality Analyst, you will be responsible for ensuring the accuracy and quality of our data outputs. 
This includes designing and running pre/post-release comparisons across key attributes, identifying and documenting issues missed by automated tests, and making disposition recommendations for each issue.</p>\n<p>In addition to data output validation, you will also be responsible for bug investigation and root cause analysis. This will involve reviewing and prioritizing DPQ Jira issues, querying Snowflake to trace anomalies to source, and producing clear reports outlining the issue, evidence, likely cause, and next steps for both technical and non-technical audiences.</p>\n<p>You will also be responsible for preparing the weekly publication review package, which includes compiling a weekly record of which data pipelines ran, their completion status, any anomalies in run time or output volume, and a comparison against expected behavior.</p>\n<p>The ideal candidate will have 6+ years of experience in data quality, data analysis, or analytics engineering, preferably in healthcare or a related field. They will also have strong SQL skills, experience with Snowflake, and the ability to work through ambiguous issues from signal to root cause.</p>\n<p>This is a senior-level role that requires a high degree of autonomy and independence. 
The successful candidate will be able to work effectively in a fast-paced environment and prioritize multiple tasks and projects simultaneously.</p>","url":"https://yubhub.co/jobs/job_d0db29d2-16c","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Komodo Health","sameAs":"https://www.komodohealth.com/","logo":"https://logos.yubhub.co/komodohealth.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/komodohealth/jobs/8476216002","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["data quality","data analysis","analytics engineering","SQL","Snowflake","bug investigation","root cause analysis"],"x-skills-preferred":["Python","Gemini","Claude","Cursor"],"datePosted":"2026-04-18T15:54:47.018Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York, NY"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"data quality, data analysis, analytics engineering, SQL, Snowflake, bug investigation, root cause analysis, Python, Gemini, Claude, Cursor"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_58a44dab-91a"},"title":"Partner Solutions Architect - Japan","description":"<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across Japan. 
This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>You will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud.</p>\n<p>Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing. This is not a purely reactive enablement role. The Partner SA is expected to help shape and execute repeatable partner plays that create revenue.</p>\n<p>That includes enabling partner sellers and architects, supporting account mapping and seller-to-seller engagement, helping define joint value propositions, supporting partner-led pipeline generation, and influencing product and field strategy based on what is learned in-market.</p>\n<p>Internal operating docs show this motion consistently includes enablement sessions, QBR sponsorships, account planning, workshops, field events, and targeted campaigns designed to produce sourced and influenced pipeline.</p>\n<p>You&#39;ll be part of a team helping dbt scale its ecosystem through better partner capability, tighter field alignment, and more repeatable pipeline generation. 
The role is especially important as dbt continues investing in structured partner motions and deeper engagement with major cloud and data platform partners.</p>\n<p>What you&#39;ll do:</p>\n<ul>\n<li>Partner closely with Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n</ul>\n<ul>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n</ul>\n<ul>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n</ul>\n<ul>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n</ul>\n<ul>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n</ul>\n<ul>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>\n</ul>\n<ul>\n<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n</ul>\n<ul>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n</ul>\n<ul>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n</ul>\n<ul>\n<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n</ul>\n<ul>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA 
function</li>\n</ul>\n<ul>\n<li>Travel approximately 30-40% to support partner planning, enablement, executive meetings, and field events across Japan</li>\n</ul>\n<p>This scope reflects how the Partner SA team is already operating: enabling partner field teams, building account-level alignment, supporting QBRs and regional events, and translating those activities into sourced and engaged pipeline.</p>\n<p>What you&#39;ll need:</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or another customer-facing technical role in data and analytics</li>\n</ul>\n<ul>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n</ul>\n<ul>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n</ul>\n<ul>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n</ul>\n<ul>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n</ul>\n<ul>\n<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n</ul>\n<ul>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n</ul>\n<ul>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>\n</ul>\n<ul>\n<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>\n</ul>\n<ul>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out:</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n</ul>\n<ul>\n<li>Experience with 
Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n</ul>\n<ul>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n</ul>\n<ul>\n<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>\n</ul>\n<ul>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n</ul>\n<ul>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n</ul>\n<ul>\n<li>Ability to influence both strategy and execution, from partner messaging and field enablement to product feedback and GTM refinement</li>\n</ul>\n<ul>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>What to expect in the interview process (all video interviews unless accommodations are needed):</p>\n<ul>\n<li>Interview with Talent Acquisition Partner</li>\n</ul>\n<ul>\n<li>Interview with Hiring Manager</li>\n</ul>\n<ul>\n<li>Team Interviews</li>\n</ul>\n<ul>\n<li>Demo Round</li>\n</ul>","url":"https://yubhub.co/jobs/job_58a44dab-91a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673657005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data platforms","Snowflake","Databricks","Google Cloud","partner engineering","customer-facing technical 
role"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:53:29.744Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Japan - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner engineering, customer-facing technical role, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6c1cd36d-464"},"title":"Senior Security Operations Engineer, Detection & Response","description":"<p>About Us</p>\n<p>dbt Labs is the pioneer of analytics engineering, helping data teams transform raw data into reliable, actionable insights. As of February 2025, we’ve grown from an open source project into the leading analytics engineering platform, now used by over 90,000 teams every week, driving data transformations and AI use cases.</p>\n<p>We’re backed by top-tier investors including Andreessen Horowitz, Sequoia Capital, and Altimeter. At our core, we believe in empowering data practitioners:</p>\n<ul>\n<li>Reliable, high-quality data is the fuel that propels AI-powered data engineering.</li>\n<li>AI is changing data work, fast. 
dbt’s data control plane keeps data engineers ahead of that curve.</li>\n<li>We empower engineers to deliver reliable, governed data faster, cheaper, and at scale.</li>\n</ul>\n<p>About the Security Team</p>\n<p>The mission of the Security Engineering team at dbt Labs is to provide clear, opinionated security guidance and scalable, secure-by-default offerings to engineers for the purpose of securing software development and enabling pragmatic risk decisions at dbt.</p>\n<p><strong>Responsibilities</strong></p>\n<p>As a Senior Security Operations Engineer on the Detection &amp; Response team, you will strengthen and maintain the company&#39;s security posture throughout the threat detection lifecycle, from telemetry collection and continuous monitoring through threat detection, incident response, and security event management. You will serve as a subject matter expert for security operations across dbt Labs&#39; teams and technology infrastructure, including multi-cloud production environments, identity, endpoints, and SaaS technologies.</p>\n<p><strong>Key Responsibilities</strong></p>\n<ul>\n<li>Participate in a 24/7 on-call rotation providing coverage for active security incidents, investigations, and security events across our global infrastructure.</li>\n<li>Lead investigation and remediation of security incidents, coordinating cross-functional response efforts to minimize impact and recovery time.</li>\n<li>Play a major role in bootstrapping an end-to-end D&amp;R alert and investigation pipeline.</li>\n<li>Triage and investigate security alerts from detection tools including Wiz Defend, Crowdstrike, and cloud security platforms to identify genuine threats and reduce false positives.</li>\n<li>Develop and maintain detection rules, runbooks, and response procedures mapped to the company&#39;s threat model.</li>\n<li>Automate alert triage workflows and improve mean time to detection and response through tooling and process enhancements, including leveraging AI 
enrichment and processing.</li>\n<li>Collaborate with Infrastructure and Application Security teams to implement secure-by-design principles and remediate identified security issues.</li>\n<li>Conduct security event analysis to identify policy violations, misconfigurations, and potential attack vectors before they become incidents.</li>\n<li>Partner with our Enterprise Security &amp; Technology team to enhance endpoint security controls and monitoring across endpoints (MacOS laptops &amp; some Windows and Linux-based development environments).</li>\n<li>Design and facilitate tabletop exercises and game days to test detection, response, recovery, and remediation capabilities.</li>\n<li>Contribute to the maturation of the security incident response program through documentation, training, and process improvements.</li>\n<li>Mentor junior security engineers and cross-functional team members on incident handling best practices.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Demonstrated ability to excel in high-pressure situations; we need someone who can make sound decisions during active security incidents and can calmly serve as incident commander with confidence.</li>\n</ul>\n<p><strong>Qualifications</strong></p>\n<ul>\n<li>Have 8+ years of professional experience in security-related domains, including at least 4 years in security operations, incident response, threat hunting, or threat detection roles.</li>\n<li>Have demonstrable experience leading security incident investigations and coordinating cross-team response efforts.</li>\n</ul>\n<p><strong>What We Offer</strong></p>\n<ul>\n<li>Competitive compensation packages commensurate with experience, including salary, equity, and where applicable, performance-based pay.</li>\n<li>Opportunity to work with a leading analytics engineering platform and contribute to the growth and success of the company.</li>\n<li>Collaborative and dynamic work environment with a team of experienced 
professionals.</li>\n<li>Opportunities for professional growth and development.</li>\n</ul>\n<p><strong>How to Apply</strong></p>\n<p>If you are a motivated and experienced security professional looking for a new challenge, please submit your resume and cover letter to [insert contact information]. We look forward to hearing from you!</p>","url":"https://yubhub.co/jobs/job_6c1cd36d-464","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4674498005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Security Operations","Incident Response","Threat Hunting","Threat Detection","Cloud Security","Endpoint Security","Security Event Analysis","Security Incident Response","Tabletop Exercises","Game Days","Documentation","Training","Process Improvements","Mentoring","Security Engineering","Data Control Plane","Analytics Engineering","AI-Powered Data Engineering","Reliable High-Quality Data","Secure-By-Default Offerings","Pragmatic Risk Decisions","Multi-Cloud Production Environments","Identity","Endpoints","SaaS Technologies","Wiz Defend","Crowdstrike","Cloud Security Platforms","Detection Rules","Runbooks","Response Procedures","Mean Time to Detection","Mean Time to Response","AI Enrichment","AI Processing","Secure-By-Design Principles","Infrastructure Security","Application Security","Endpoint Security Controls","Monitoring"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:52:43.496Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US - 
Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Security Operations, Incident Response, Threat Hunting, Threat Detection, Cloud Security, Endpoint Security, Security Event Analysis, Security Incident Response, Tabletop Exercises, Game Days, Documentation, Training, Process Improvements, Mentoring, Security Engineering, Data Control Plane, Analytics Engineering, AI-Powered Data Engineering, Reliable High-Quality Data, Secure-By-Default Offerings, Pragmatic Risk Decisions, Multi-Cloud Production Environments, Identity, Endpoints, SaaS Technologies, Wiz Defend, Crowdstrike, Cloud Security Platforms, Detection Rules, Runbooks, Response Procedures, Mean Time to Detection, Mean Time to Response, AI Enrichment, AI Processing, Secure-By-Design Principles, Infrastructure Security, Application Security, Endpoint Security Controls, Monitoring"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d9c05c37-885"},"title":"Staff Data Analyst","description":"<p>We&#39;re looking for a Staff Data Analyst to join our Data Science team at Stripe. As a Staff Data Analyst, you will play a key role in strengthening Stripe&#39;s analytics foundation across the company.</p>\n<p>Your primary responsibility will be to lead architecture reviews and set analytical standards. You will own the strategy, technical architecture, and governance model for platforms that make key business metrics consistent, trustworthy, and easy to query at scale. 
You will also be responsible for owning the hardest cross-cutting analytical problems, driving org-wide data quality and consistency, and shaping long-term technical vision.</p>\n<p>To succeed in this role, you will need to have 10+ years of experience in Data Analysis, Analytics Engineering, Business Intelligence Engineering, or Data Science roles, with deep expertise in SQL and familiarity with AI-assisted development tools. You will also need to have a proven track record of setting long-term technical vision for analytics platforms and experience leading cross-team or org-wide data initiatives.</p>\n<p>As a Staff Data Analyst at Stripe, you will have the opportunity to work with a talented team of data analysts, data scientists, and engineers to build and maintain a robust analytics infrastructure that supports business growth and decision-making.</p>\n<p>If you are a motivated and experienced data professional looking for a challenging role with a high-growth company, we encourage you to apply.</p>","url":"https://yubhub.co/jobs/job_d9c05c37-885","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Stripe","sameAs":"https://stripe.com","logo":"https://logos.yubhub.co/stripe.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/stripe/jobs/7801457","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","AI-assisted development tools","Data analysis","Analytics engineering","Business intelligence engineering","Data science"],"x-skills-preferred":["Warehouse design","Metrics infrastructure","Performance optimization","Distributed data frameworks","Semantic layer or metrics 
layer"],"datePosted":"2026-04-18T15:51:08.035Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, AI-assisted development tools, Data analysis, Analytics engineering, Business intelligence engineering, Data science, Warehouse design, Metrics infrastructure, Performance optimization, Distributed data frameworks, Semantic layer or metrics layer"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_50808499-c0b"},"title":"Senior Customer Solutions Resident Architect","description":"<p>About Us</p>\n<p>dbt Labs is the pioneer of analytics engineering, helping data teams transform raw data into reliable, actionable insights. Since 2016, we’ve grown from an open source project into the leading analytics engineering platform, now used by over 90,000 teams every week, driving data transformations and AI use cases.</p>\n<p>As of February 2025, we’ve surpassed $100 million in annual recurring revenue (ARR) and serve more than 5,400 dbt Platform customers, including AstraZeneca, Sky, Nasdaq, Volvo, JetBlue, and SafetyCulture.</p>\n<p>We’re backed by top-tier investors including Andreessen Horowitz, Sequoia Capital, and Altimeter. At our core, we believe in empowering data practitioners:</p>\n<ul>\n<li>Reliable, high-quality data is the fuel that propels AI-powered data engineering.</li>\n</ul>\n<ul>\n<li>AI is changing data work, fast. dbt’s data control plane keeps data engineers ahead of that curve.</li>\n</ul>\n<ul>\n<li>We empower engineers to deliver reliable, governed data faster, cheaper, and at scale.</li>\n</ul>\n<p>dbt Labs is now synonymous with analytics engineering, defining the modern data stack and serving as the data control plane for enterprise teams around the world. 
And we’re just getting started.</p>\n<p>We’re growing fast and building a team of passionate, curious people across the globe. Learn more about what makes us special by checking out our values.</p>\n<p><strong>About the Role</strong></p>\n<p>We are seeking an experienced Senior Customer Solutions Resident Architect to join our team. In this role, you will drive critical customer outcomes by delivering high-impact technical guidance to strategic accounts. You will be part of a high-visibility initiative that supports pre-sales, accelerates adoption, enables key migrations, and mitigates churn risks.</p>\n<p>This role is designed to deploy RA-level expertise flexibly, aligning with customer and business needs to drive growth, retention, and expansion.</p>\n<p><strong>What You’ll Do</strong></p>\n<ul>\n<li>Accelerate Customer Success Across the Lifecycle</li>\n</ul>\n<ul>\n<li>Support strategic pre-sales opportunities by providing technical expertise to prospects</li>\n</ul>\n<ul>\n<li>Assist in launching and onboarding new customers who have not purchased RA services, ensuring they successfully adopt dbt Cloud</li>\n</ul>\n<ul>\n<li>Execute proactive adoption plays, including migrations, new feature implementations (e.g., Semantic Layer, Mesh), and major version upgrades</li>\n</ul>\n<ul>\n<li>Lead reactive adoption initiatives to de-risk churn or contraction and position accounts for future growth</li>\n</ul>\n<ul>\n<li>Deliver Technical Excellence</li>\n</ul>\n<ul>\n<li>Advise on architecture, design, implementation, troubleshooting, and best practices in dbt Cloud environments</li>\n</ul>\n<ul>\n<li>Build solution MVPs and guide long-term technical strategies tailored to customer needs</li>\n</ul>\n<ul>\n<li>Engage on multiple projects simultaneously with clear scoping, start and end dates, and outcome tracking</li>\n</ul>\n<ul>\n<li>Collaborate Across Teams</li>\n</ul>\n<ul>\n<li>Partner closely with Customer Solutions Architects (CSAs), Sales, Solutions 
Architects, Training, and Support</li>\n</ul>\n<ul>\n<li>Provide feedback to Product and Engineering to improve customer experience and prioritize technical needs</li>\n</ul>\n<ul>\n<li>Champion customer success through thoughtful, transparent communication and cross-functional collaboration</li>\n</ul>\n<ul>\n<li>Advance Best Practices and Team Impact</li>\n</ul>\n<ul>\n<li>Help build out and refine this evolving function alongside the broader RA organization</li>\n</ul>\n<ul>\n<li>Track and manage capacity and engagement effectiveness similarly to other RA-led initiatives</li>\n</ul>\n<p><strong>What You’ll Need</strong></p>\n<ul>\n<li>5+ years of experience in technical customer-facing roles such as post-sales consulting, technical architecture, or solution delivery</li>\n</ul>\n<ul>\n<li>Expertise with at least one modern cloud data platform (Snowflake, Databricks, BigQuery, or Redshift)</li>\n</ul>\n<ul>\n<li>Hands-on experience deploying or configuring dbt Cloud, with at least 1 year working with dbt</li>\n</ul>\n<ul>\n<li>Strong proficiency in SQL; working knowledge of Python in analytics contexts preferred</li>\n</ul>\n<ul>\n<li>Comfort leading technical project delivery, managing scope, timelines, and stakeholder expectations across multiple simultaneous engagements</li>\n</ul>\n<ul>\n<li>Clear, concise communication skills for both technical and executive audiences</li>\n</ul>\n<ul>\n<li>A collaborative mindset, thriving in a remote, transparent, and highly cross-functional organization</li>\n</ul>\n<ul>\n<li>Willingness to travel 2–4 times per year for company-wide events</li>\n</ul>\n<p><strong>What Will Make You Stand Out</strong></p>\n<ul>\n<li>dbt Analytics Engineering Certification</li>\n</ul>\n<ul>\n<li>Ability to influence technical direction and build consensus across internal and customer teams</li>\n</ul>\n<ul>\n<li>Experience with traditional enterprise ETL tools (e.g., Informatica, Datastage, Talend) and how they relate to modern data 
workflows</li>\n</ul>\n<ul>\n<li>Familiarity with strategic sales or renewal processes, including proactive and reactive adoption efforts</li>\n</ul>\n<ul>\n<li>Proven success accelerating usage, adoption, and expansion in large, complex accounts</li>\n</ul>\n<p><strong>Remote Hiring Process</strong></p>\n<ul>\n<li>Interview with a Talent Acquisition Partner</li>\n</ul>\n<ul>\n<li>Interview with Hiring Manager</li>\n</ul>\n<ul>\n<li>Task</li>\n</ul>\n<ul>\n<li>Task Review</li>\n</ul>\n<ul>\n<li>Final Values Interview</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Unlimited vacation time with a culture that actively encourages time off</li>\n</ul>\n<ul>\n<li>401k plan with 3% guaranteed company contribution</li>\n</ul>\n<ul>\n<li>Comprehensive healthcare coverage</li>\n</ul>\n<ul>\n<li>Generous paid parental leave</li>\n</ul>\n<ul>\n<li>Health &amp; wellness stipend</li>\n</ul>\n<ul>\n<li>Flexible stipends for:</li>\n</ul>\n<ul>\n<li>Home office setup</li>\n</ul>\n<ul>\n<li>Learning and development</li>\n</ul>\n<ul>\n<li>Office space</li>\n</ul>\n<ul>\n<li>And more!</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>We offer competitive compensation packages commensurate with experience, including salary, equity, and where applicable, performance-based pay. 
Our Talent Acquisition Team can answer questions around dbt Labs’ total rewards during your interview process.</p>\n<p>In Boston, Chicago, Denver, Los Angeles, Philadelphia, New York Metro, San Francisco, DC Metro, Seattle, and Austin, an alternate range may apply, as specified below.</p>\n<ul>\n<li>The typical starting salary range for this role in the specific locations listed is: $163,000 - $200,000</li>\n</ul>\n<ul>\n<li>The typical starting salary range for this role is: $146,000 - $180,000</li>\n</ul>","url":"https://yubhub.co/jobs/job_50808499-c0b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4682381005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$146,000 - $180,000","x-skills-required":["modern cloud data platform","dbt Cloud","SQL","Python","technical project delivery","clear, concise communication skills"],"x-skills-preferred":["dbt Analytics Engineering Certification","traditional enterprise ETL tools","strategic sales or renewal processes","proven success accelerating usage, adoption, and expansion"],"datePosted":"2026-04-18T15:50:33.608Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"modern cloud data platform, dbt Cloud, SQL, Python, technical project delivery, clear, concise communication skills, dbt Analytics Engineering Certification, traditional enterprise ETL tools, strategic sales or renewal processes, proven success accelerating usage, adoption, and 
expansion","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":146000,"maxValue":180000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e40d534f-76a"},"title":"Resident Architect","description":"<p>About Us</p>\n<p>dbt Labs is the pioneer of analytics engineering, helping data teams transform raw data into reliable, actionable insights. As of February 2025, we&#39;ve surpassed $100 million in annual recurring revenue (ARR) and serve more than 5,400 dbt Platform customers.</p>\n<p>We&#39;re seeking an experienced Resident Architect (RA) with a passion for solving challenging problems with dbt to join our Professional Services team. RAs are billable to dbt Enterprise customers and help achieve our mission to empower data developers to create and disseminate organisational knowledge.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Work on a variety of impactful customer technical projects - inclusive of implementation, troubleshooting configurations, instilling best practices, and solutioning MVPs and long-term solutions to customer-specific requirements</li>\n</ul>\n<ul>\n<li>Consult on architecture and design</li>\n</ul>\n<ul>\n<li>Ensure our most strategic enterprise customers are adopting the product</li>\n</ul>\n<ul>\n<li>Collaborate with other internal customer-facing teams at dbt Labs - Sales, Solution Architects, Training, Support</li>\n</ul>\n<ul>\n<li>Provide critical feedback to dbt Labs product and engineering teams to improve and prioritise customer requests and ensure rapid resolution for engagement-specific issues</li>\n</ul>\n<ul>\n<li>Become a product expert with dbt in the context of the modern data stack (if you aren&#39;t already)</li>\n</ul>\n<p>What You&#39;ll Need</p>\n<ul>\n<li>4+ years&#39; experience working with technical data tooling, even better if it is in a customer-facing post-sales, technical 
architect or consulting role</li>\n</ul>\n<ul>\n<li>Deep expertise in at least one data platform (Snowflake, Databricks, BigQuery, Redshift)</li>\n</ul>\n<ul>\n<li>Experience using, deploying, or configuring dbt in an enterprise setting - working with dbt for a minimum of 1 year</li>\n</ul>\n<ul>\n<li>Proficiency in writing SQL and Python in analytics contexts</li>\n</ul>\n<ul>\n<li>You look forward to building skills in technical areas that support deployment and integration of dbt enterprise solutions to complete customer projects</li>\n</ul>\n<ul>\n<li>Customer focus, embracing one of our core values: users are our best advocates</li>\n</ul>\n<ul>\n<li>Strong organisational skills with the ability to manage multiple technical projects simultaneously - including defining scope, tracking timelines, and ensuring deliverables are met</li>\n</ul>\n<ul>\n<li>Clear and concise communicator with the ability to engage internal and external stakeholders, effectively explain complex technical or organisational challenges, and propose thoughtful, iterative solutions</li>\n</ul>\n<ul>\n<li>The ability to thrive in a remote organisation that highly values transparency and cross-collaboration</li>\n</ul>\n<ul>\n<li>Travel approximately 2-4x/year for customer onsite sessions, team offsites, and company events will be expected</li>\n</ul>\n<p>What Will Make You Stand Out</p>\n<ul>\n<li>You have obtained the dbt Analytics Engineering Certification</li>\n</ul>\n<ul>\n<li>You have the ability to advise on dbt enterprise recommendations, and build direction/consensus with the customer to move forward</li>\n</ul>\n<ul>\n<li>Experience with traditional Enterprise ETL tooling (Informatica, Datastage, Talend)</li>\n</ul>\n<p>Remote Hiring Process</p>\n<ul>\n<li>Interview with a Talent Acquisition Partner</li>\n</ul>\n<ul>\n<li>Hiring Manager Interview</li>\n</ul>\n<ul>\n<li>Technical Task + Presentation</li>\n</ul>\n<ul>\n<li>Team Interview</li>\n</ul>\n<p>Benefits</p>\n<ul>\n<li>Unlimited 
vacation time with a culture that actively encourages time off</li>\n</ul>\n<ul>\n<li>401k plan with 3% guaranteed company contribution</li>\n</ul>\n<ul>\n<li>Comprehensive healthcare coverage</li>\n</ul>\n<ul>\n<li>Generous paid parental leave</li>\n</ul>\n<ul>\n<li>Flexible stipends for:</li>\n</ul>\n<ul>\n<li>Health &amp; Wellness</li>\n</ul>\n<ul>\n<li>Home Office Setup</li>\n</ul>\n<ul>\n<li>Cell Phone &amp; Internet</li>\n</ul>\n<ul>\n<li>Learning &amp; Development</li>\n</ul>\n<ul>\n<li>Office Space</li>\n</ul>\n<p>Compensation</p>\n<p>We offer competitive compensation packages commensurate with experience, including salary, equity, and where applicable, performance-based pay. Our Talent Acquisition Team can answer questions around dbt Labs&#39; total rewards during your interview process.</p>\n<p>In select locations (including Boston, Chicago, Denver, Los Angeles, Philadelphia, New York City, San Francisco, Washington, DC, and Seattle), an alternate range may apply, as specified below.</p>\n<ul>\n<li>The typical starting salary range for this role is:</li>\n</ul>\n<p>$114,000 - $137,700</p>\n<ul>\n<li>The typical starting salary range for this role in the select locations listed is:</li>\n</ul>\n<p>$126,000 - $153,000</p>","url":"https://yubhub.co/jobs/job_e40d534f-76a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4627942005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$114,000 - $137,700","x-skills-required":["dbt","data platform","Snowflake","Databricks","BigQuery","Redshift","SQL","Python","analytics 
engineering"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:49:56.862Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"dbt, data platform, Snowflake, Databricks, BigQuery, Redshift, SQL, Python, analytics engineering","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":114000,"maxValue":137700,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3168d7d3-70b"},"title":"Partner Solutions Architect - North America","description":"<p>About Us</p>\n<p>We&#39;re looking for a Partner Solutions Architect to join the Field Engineering team and help scale dbt&#39;s partner go-to-market motion across North America. This role is focused on building technical and commercial momentum with both consulting and technology partners.</p>\n<p>As a Partner Solutions Architect, you will work closely with Partner Development Managers to drive partner capability, field alignment, and pipeline across strategic SI and consulting partners as well as key technology partners such as Snowflake, Databricks, and Google Cloud. 
Internally, this role sits at the intersection of Field Engineering, Partnerships, Sales, Product, and Partner Marketing.</p>\n<p>Responsibilities</p>\n<ul>\n<li>Partner closely with North America Partner Development Managers to execute joint GTM plans across technology and SI/consulting partners.</li>\n<li>Build trusted technical relationships with partner architects, sellers, and practice leaders</li>\n<li>Run partner enablement sessions, workshops, office hours, and hands-on technical trainings to improve partner capability and field readiness</li>\n<li>Support account mapping and seller-to-seller alignment between dbt and partner field teams to uncover and accelerate pipeline</li>\n<li>Help create and refine repeatable sales plays across themes like core-to-cloud migration, modernization, AI-ready data foundations, marketplace, semantic layer, and partner platform adoption</li>\n<li>Support partner-led and tri-party pipeline generation efforts including QBRs, innovation days, lunch-and-learns, hands-on labs, and local field events</li>\n<li>Equip partner teams with the technical messaging, demo narratives, architectures, and customer use cases needed to position dbt effectively</li>\n<li>Collaborate with dbt Account Executives, Sales Engineers, and regional sales leadership to drive co-sell execution in target accounts</li>\n<li>Act as a technical bridge between partners and dbt Product / Engineering by surfacing integration gaps, field feedback, competitive insights, and roadmap opportunities</li>\n<li>Serve as an internal subject matter expert on dbt’s major technology partner ecosystem, especially Snowflake, Databricks, and Google Cloud</li>\n<li>Contribute to the scale motion by helping build collateral, playbooks, enablement assets, and best practices that raise the bar across the broader Partner SA function</li>\n</ul>\n<p>Requirements</p>\n<ul>\n<li>5+ years of experience in solutions architecture, sales engineering, consulting, partner engineering, or 
another customer-facing technical role in data and analytics</li>\n<li>Strong hands-on background in SQL, data modeling, analytics engineering, and modern data platforms</li>\n<li>Ability to clearly explain modern data stack architectures and how dbt fits across warehouses, lakehouses, semantic layers, and AI-oriented workflows</li>\n<li>Experience translating technical capabilities into clear business value for both technical and non-technical audiences</li>\n<li>Comfort operating in highly cross-functional environments across Sales, Partnerships, Product, and Marketing</li>\n<li>Strong presentation, workshop, and facilitation skills, including external enablement and customer-facing sessions</li>\n<li>Proven ability to drive outcomes in ambiguous, fast-moving environments with multiple stakeholders</li>\n<li>Experience supporting complex enterprise buying motions, proof-of-value work, or partner-influenced sales cycles</li>\n<li>Strong written communication skills for building collateral, technical narratives, and partner-facing content</li>\n<li>A collaborative mindset and a desire to help scale best practices across a growing team</li>\n</ul>\n<p>What will make you stand out</p>\n<ul>\n<li>Experience working directly in partner, alliance, or ecosystem roles</li>\n<li>Experience with Snowflake, Databricks, BigQuery / Google Cloud, AWS, or Microsoft Fabric in a GTM or solutions context</li>\n<li>Experience enabling systems integrators, consulting firms, or technology partner field teams</li>\n<li>Familiarity with cloud marketplace motions, co-sell programs, and partner-sourced pipeline generation</li>\n<li>Prior experience with dbt, analytics engineering workflows, or adjacent tooling in transformation, orchestration, governance, or metadata</li>\n<li>Strong instincts for identifying repeatable plays that connect enablement activity to measurable pipeline outcomes</li>\n<li>Ability to influence both strategy and execution, from partner messaging and field 
enablement to product feedback and GTM refinement</li>\n<li>A track record of building credibility quickly with partner sellers, partner architects, and internal field teams</li>\n</ul>\n<p>Benefits</p>\n<ul>\n<li>Unlimited vacation (and yes we use it!)</li>\n<li>Pension coverage</li>\n<li>Excellent healthcare</li>\n<li>Paid Parental Leave</li>\n<li>Wellness stipend</li>\n<li>Home office stipend, and more!</li>\n</ul>","url":"https://yubhub.co/jobs/job_3168d7d3-70b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"dbt Labs","sameAs":"https://www.getdbt.com/","logo":"https://logos.yubhub.co/getdbt.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/dbtlabsinc/jobs/4673630005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["SQL","data modeling","analytics engineering","modern data platforms","Snowflake","Databricks","Google Cloud","partner development","field engineering","sales engineering","consulting","partner engineering"],"x-skills-preferred":["cloud marketplace motions","co-sell programs","partner-sourced pipeline generation","dbt","analytics engineering workflows","transformation","orchestration","governance","metadata"],"datePosted":"2026-04-18T15:48:30.813Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Canada - Remote; US - Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, data modeling, analytics engineering, modern data platforms, Snowflake, Databricks, Google Cloud, partner development, field engineering, sales engineering, consulting, partner engineering, cloud marketplace motions, co-sell programs, partner-sourced pipeline generation, dbt, analytics engineering workflows, 
transformation, orchestration, governance, metadata"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8ea1c59a-b10"},"title":"Analytics Engineer","description":"<p><strong>About the role</strong></p>\n<p>Behind many of life&#39;s most important transactions (buying a house, applying for a mortgage, getting a small business loan, or refinancing a credit card) is a network of credit relationships. Setpoint provides critical infrastructure for relationships between the world&#39;s largest banks, credit funds, and capital markets counterparties.</p>\n<p>We are looking for an Analytics Engineer to join our team supporting our external data products. In this position, you will report to our Head of Analytics and partner closely with engineering, product, and implementation teams to build and maintain the data infrastructure that powers our products. This is a hands-on role where you&#39;ll design and maintain data pipelines, create dashboards, and work directly with stakeholders to deliver insights that drive business outcomes.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Own end-to-end implementations of the data products for our external customers.</li>\n<li>Build and maintain scalable data pipelines and analytics models using Python, DBT, and SQL.</li>\n<li>Create and manage dashboards in our external and internal analytics products so stakeholders have actionable insights.</li>\n<li>Prototype new data products to add to our product suite.</li>\n<li>Collaborate with engineering to drive the roadmap for the data products.</li>\n<li>Use GitHub-based workflows to maintain clean, version-controlled analytics code.</li>\n<li>Act as a subject matter expert for analytics best practices.</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>5+ years of experience in analytics engineering, data engineering, or a similar technical role.</li>\n<li>Proven expertise in DBT, SQL, 
and GitHub-based development. Python is a plus.</li>\n<li>Strong experience designing, implementing, and maintaining data pipelines and models.</li>\n<li>Experience in a customer-facing role, working with technical and business stakeholders.</li>\n<li>Experience working with asset managers, business owners, or financial services data sets is a plus.</li>\n<li>Excellent problem-solving, communication, and collaboration skills.</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Competitive salaries</li>\n<li>Stock options</li>\n<li>Medical, dental, and vision coverage</li>\n<li>401(k)</li>\n<li>Short term and long term disability coverage</li>\n<li>Flexible vacation</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>$150,000 - $170,000 OTE dependent on multiple factors, which may include the candidate&#39;s skills, experience, location, and other qualifications.</p>","url":"https://yubhub.co/jobs/job_8ea1c59a-b10","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Setpoint","sameAs":"https://setpoint.com","logo":"https://logos.yubhub.co/setpoint.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/setpoint/jobs/4801438007","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$150,000 - $170,000","x-skills-required":["Python","SQL","DBT","GitHub","Data engineering","Analytics engineering"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:57:45.110Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Austin or New York (Hybrid)"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"Python, SQL, DBT, GitHub, Data engineering, Analytics 
engineering","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":150000,"maxValue":170000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8a8c0eb9-6e6"},"title":"Data Scientist, Product","description":"<p><strong>Job Title: Data Scientist, Product</strong></p>\n<p>This is the founding hire for product analytics at Hebbia. As a data scientist, you will define what our core product metrics are: what counts as an active user, what engagement actually means, what signals correlate with retention.</p>\n<p>This is not a dashboarding role. The goal is to shape product decisions with data, not just report on them. You will identify which workflows drive repeat usage, where users drop off, what features move engagement, and what differentiates power users from casual users across our enterprise customer base.</p>\n<p>The role sits at the intersection of analytics engineering, product analytics, and data science. You will build the infrastructure and do the analysis. Define the metrics, build the pipelines, create the dashboards, and use what you built to inform the roadmap.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Define and implement Hebbia&#39;s core product metrics from scratch: active users, engagement, retention, feature adoption, account health. Build the canonical definitions the entire company uses.</li>\n<li>Design and build the product analytics infrastructure: fact tables, clean data models, and the analytics layer that sits on top of our product data.</li>\n<li>Build and maintain executive and product dashboards that leadership and product teams use to make decisions.</li>\n<li>Write DAGs, transforms, and data pipelines that support analytics. 
Work with engineering to instrument the product so usage data is captured correctly.</li>\n<li>Analyze customer behavior across our B2B customer base: account-level usage patterns, workflow adoption, expansion signals, and churn risk indicators.</li>\n<li>Inform the product roadmap using data. Identify friction in user flows, surface feature adoption patterns, and highlight opportunities for product improvement.</li>\n<li>Partner with product managers and engineers to translate product questions into measurable data and structured experiments.</li>\n<li>Establish data quality standards and documentation so the metrics layer you build is trusted and maintained.</li>\n</ul>\n<p><strong>Who You Are</strong></p>\n<ul>\n<li>3+ years of experience in product analytics, analytics engineering, or data science at a B2B SaaS company or high-growth startup</li>\n<li>Strong in SQL and Python. You can write production-quality transforms, not just ad hoc queries.</li>\n<li>Experience with modern data stack tools: dbt, Airflow, Snowflake, BigQuery, or similar. You understand data modeling and warehouse architecture.</li>\n<li>You have built dashboards and reporting that product teams and leadership actually use to make decisions</li>\n<li>You understand B2B product analytics: account-level metrics, multi-user workflows, enterprise engagement patterns, and why B2B retention analysis is different from consumer</li>\n<li>You translate ambiguous product questions into structured analyses. You do not wait for someone to hand you a spec.</li>\n<li>Strong product intuition. You care about why users behave the way they do, not just what the numbers say.</li>\n<li>Clear communicator. You can present findings to engineers, product managers, and executives with equal effectiveness.</li>\n</ul>\n<p><strong>Compensation</strong></p>\n<p>The salary range for this position is set between $180,000 and $260,000. 
This range may be inclusive of several career levels at Hebbia and will be narrowed during the interview process based on the candidate’s experience and qualifications.</p>","url":"https://yubhub.co/jobs/job_8a8c0eb9-6e6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Hebbia","sameAs":"https://hebbia.com/","logo":"https://logos.yubhub.co/hebbia.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/hebbia/jobs/4670090005","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$180,000 - $260,000","x-skills-required":["SQL","Python","dbt","Airflow","Snowflake","BigQuery","data modeling","warehouse architecture","product analytics","analytics engineering","data science"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:37:39.339Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"New York City; San Francisco, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"SQL, Python, dbt, Airflow, Snowflake, BigQuery, data modeling, warehouse architecture, product analytics, analytics engineering, data science","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":260000,"unitText":"YEAR"}}}]}