{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/data-warehousing-solutions"},"x-facet":{"type":"skill","slug":"data-warehousing-solutions","display":"Data Warehousing Solutions","count":4},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7275ef33-009"},"title":"Staff Data Engineer","description":"<p>At Bayer, we&#39;re seeking a Staff Data Engineer to join our team. As a Staff Data Engineer, you will design and lead the implementation of data flows to connect operational systems and data for analytics and business intelligence (BI) systems. 
You will recognize opportunities to reuse existing data flows, lead the build of data streaming systems, optimize the code to ensure processes perform optimally, and lead work on database management.</p>\n<p>Communicating Between Technical and Non-Technical Colleagues</p>\n<p>As a Staff Data Engineer, you will communicate effectively with technical and non-technical stakeholders, support and host discussions within a multidisciplinary team, and be an advocate for the team externally.</p>\n<p>Data Analysis and Synthesis</p>\n<p>You will undertake data profiling and source system analysis, and present clear insights to colleagues to support the end use of the data.</p>\n<p>Data Development Process</p>\n<p>You will design, build and test data products that are complex or large scale, and build teams to complete data integration services.</p>\n<p>Data Innovation</p>\n<p>You will understand the impact on the organization of emerging trends in data tools, analysis techniques and data usage.</p>\n<p>Data Integration Design</p>\n<p>You will select and implement the appropriate technologies to deliver resilient, scalable and future-proofed data solutions and integration pipelines.</p>\n<p>Data Modeling</p>\n<p>You will produce relevant data models across multiple subject areas, explain which models to use for which purpose, understand industry-recognised data modelling patterns and standards and when to apply them, and compare and align different data models.</p>\n<p>Metadata Management</p>\n<p>You will design an appropriate metadata repository and present changes to existing metadata repositories, understand a range of tools for storing and working with metadata, and provide oversight and advice to less experienced members of the team.</p>\n<p>Problem Resolution</p>\n<p>You will respond to problems in databases, data processes, data products and services as they occur, initiate actions, monitor services and identify trends to resolve problems, determine the appropriate remedy and 
assist with its implementation, and with preventative measures.</p>\n<p>Programming and Build</p>\n<p>You will use agreed standards and tools to design, code, test, correct and document moderate-to-complex programs and scripts from agreed specifications and subsequent iterations, collaborate with others to review specifications where appropriate.</p>\n<p>Technical Understanding</p>\n<p>You will understand the core technical concepts related to the role, and apply them with guidance.</p>\n<p>Testing</p>\n<p>You will review requirements and specifications, and define test conditions, identify issues and risks associated with work, analyse and report test activities and results.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7275ef33-009","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Bayer","sameAs":"https://talent.bayer.com","logo":"https://logos.yubhub.co/talent.bayer.com.png"},"x-apply-url":"https://talent.bayer.com/careers/job/562949976928777","x-work-arrangement":"remote","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$114,400 to $171,600","x-skills-required":["Proficiency in programming language such as Python or Java","Experience with Big Data technologies such as Hadoop, Spark, and Kafka","Familiarity with ETL processes and tools","Knowledge of SQL and NoSQL databases","Strong understanding of relational databases","Experience with data warehousing solutions","Proficiency with cloud platforms","Expertise in data modeling and design","Experience in designing and building scalable data pipelines","Experience with RESTful APIs and data integration"],"x-skills-preferred":["Relevant certifications (e.g., GCP Certified, AWS Certified, Azure Certified)","Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field","Strong analytical and communication 
skills","Ability to work collaboratively in a team environment","High level of accuracy and attention to detail"],"datePosted":"2026-04-18T22:12:56.654Z","jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Healthcare","skills":"Proficiency in programming language such as Python or Java, Experience with Big Data technologies such as Hadoop, Spark, and Kafka, Familiarity with ETL processes and tools, Knowledge of SQL and NoSQL databases, Strong understanding of relational databases, Experience with data warehousing solutions, Proficiency with cloud platforms, Expertise in data modeling and design, Experience in designing and building scalable data pipelines, Experience with RESTful APIs and data integration, Relevant certifications (e.g., GCP Certified, AWS Certified, Azure Certified), Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field, Strong analytical and communication skills, Ability to work collaboratively in a team environment, High level of accuracy and attention to detail","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":114400,"maxValue":171600,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_25f010f0-7d1"},"title":"Data Engineer","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Brex’s AI-native automation and world-class service eliminate manual expense and accounting tasks for customers so they can focus on what matters most. 
Tens of thousands of the world&#39;s best companies run on Brex, including DoorDash, Coinbase, Robinhood, Zoom, Plaid, Reddit, and SeatGeek.</p>\n<p>Working at Brex allows you to push your limits, challenge the status quo, and collaborate with some of the brightest minds in the industry. We’re committed to building a diverse team and inclusive culture and believe your potential should only be limited by how big you can dream. We make this a reality by empowering you with the tools, resources, and support you need to grow your career.</p>\n<p>Data at Brex</p>\n<p>Our Scientists and Engineers work together to make data, and insights derived from data, a core asset across Brex. But it&#39;s more than just crunching numbers. The Data team at Brex develops infrastructure, statistical models, and products using data. Our work is ingrained in Brex&#39;s decision-making process, the efficiency of our operations, our risk management policies, and the unparalleled experience we provide our customers.</p>\n<p>What You’ll Do</p>\n<p>As a Data Engineer at Brex, you will be a core contributor in transforming raw data into actionable insights for various departments across the organization. You&#39;ll collaborate closely with Data Scientists, Software Engineers, and business units to create efficient data models, pipelines, and analytics frameworks that drive the business forward. You also play a leading role in the design, implementation, and maintenance of Core Data tables, our high-quality, curated data source for a wide range of analytic applications.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our San Francisco office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. 
Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday. As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, build, and maintain data models and pipelines that scale with the growing number of services, products, and changes in the company.</li>\n</ul>\n<ul>\n<li>Collaborate closely with Data Scientists, Data Analysts, and Business teams to understand their data needs, translating them into robust, efficient, scalable data solutions that enable ease of predictive analytics, data analysis, and metrics formulation.</li>\n</ul>\n<ul>\n<li>Maintain data documentation and definitions, building and ensuring that source-of-truth tables remain high quality for data science and reporting applications.</li>\n</ul>\n<ul>\n<li>Develop and enable integration with various data sources, allowing for more data-driven initiatives across the company.</li>\n</ul>\n<ul>\n<li>Apply best practices in data management to ensure the reliability and robustness of data utilized across various analytics applications.</li>\n</ul>\n<ul>\n<li>Set and proliferate company-wide standards for data relating to structure, quality, and expectations.</li>\n</ul>\n<ul>\n<li>Act as a liaison between the technical and non-technical teams, bridging gaps and ensuring that data solutions align with business objectives.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>3+ years of experience in Data Engineering, Data Analytics, or a related field such as Analytics Engineering.</li>\n</ul>\n<ul>\n<li>2+ years of experience working with modern data transformation tools like DBT.</li>\n</ul>\n<ul>\n<li>Advanced knowledge of databases and SQL with the ability to efficiently stage, process, and transform data.</li>\n</ul>\n<ul>\n<li>Experience integrating and orchestrating data workflows with various modern data tools and systems.</li>\n</ul>\n<ul>\n<li>Experience with data modeling, ETL/ELT processes, 
and data warehousing solutions.</li>\n</ul>\n<ul>\n<li>Experience working with a data warehouse such as Snowflake.</li>\n</ul>\n<ul>\n<li>Experience with a data workflow orchestrator tool such as Airflow.</li>\n</ul>\n<ul>\n<li>Experience with a programming language such as Python.</li>\n</ul>\n<ul>\n<li>Familiarity with BI tools such as Looker, Tableau, or similar platforms is a plus.</li>\n</ul>\n<ul>\n<li>Exceptional quantitative and analytical skills.</li>\n</ul>\n<ul>\n<li>Strong communication skills and ability to collaborate with various stakeholders, both technical and non-technical.</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $120,800 - $151,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_25f010f0-7d1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8366850002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$120,800 - $151,000","x-skills-required":["DBT","databases","SQL","data modeling","ETL/ELT processes","data warehousing solutions","Snowflake","Airflow","Python","BI tools","Looker","Tableau"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:18.514Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"DBT, 
databases, SQL, data modeling, ETL/ELT processes, data warehousing solutions, Snowflake, Airflow, Python, BI tools, Looker, Tableau","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120800,"maxValue":151000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1d204fa1-067"},"title":"Data Engineer","description":"<p>Why join us</p>\n<p>Brex is the intelligent finance platform that enables companies to spend smarter and move faster in more than 200 markets. By combining global corporate cards and banking with intuitive spend management, bill pay, and travel software, Brex enables founders and finance teams to accelerate operations, gain real-time visibility, and control spend effortlessly.</p>\n<p>Data at Brex</p>\n<p>Our Scientists and Engineers work together to make data, and insights derived from data, a core asset across Brex. But it&#39;s more than just crunching numbers. The Data team at Brex develops infrastructure, statistical models, and products using data. Our work is ingrained in Brex&#39;s decision-making process, the efficiency of our operations, our risk management policies, and the unparalleled experience we provide our customers.</p>\n<p>What You’ll Do</p>\n<p>As a Data Engineer at Brex, you will be a core contributor in transforming raw data into actionable insights for various departments across the organization. You&#39;ll collaborate closely with Data Scientists, Software Engineers, and business units to create efficient data models, pipelines, and analytics frameworks that drive the business forward. You also play a leading role in the design, implementation, and maintenance of Core Data tables, our high-quality, curated data source for a wide range of analytic applications.</p>\n<p>Where you’ll work</p>\n<p>This role will be based in our Seattle office. 
We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in office - Monday, Wednesday and Thursday. As a perk, we also have up to four weeks per year of fully remote work!</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, build, and maintain data models and pipelines that scale with the growing number of services, products, and changes in the company.</li>\n</ul>\n<ul>\n<li>Collaborate closely with Data Scientists, Data Analysts, and Business teams to understand their data needs, translating them into robust, efficient, scalable data solutions that enable ease of predictive analytics, data analysis, and metrics formulation.</li>\n</ul>\n<ul>\n<li>Maintain data documentation and definitions, building and ensuring that source-of-truth tables remain high quality for data science and reporting applications.</li>\n</ul>\n<ul>\n<li>Develop and enable integration with various data sources, allowing for more data-driven initiatives across the company.</li>\n</ul>\n<ul>\n<li>Apply best practices in data management to ensure the reliability and robustness of data utilized across various analytics applications.</li>\n</ul>\n<ul>\n<li>Set and proliferate company-wide standards for data relating to structure, quality, and expectations.</li>\n</ul>\n<ul>\n<li>Act as a liaison between the technical and non-technical teams, bridging gaps and ensuring that data solutions align with business objectives.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>3+ years of experience in Data Engineering, Data Analytics, or a related field such as Analytics Engineering.</li>\n</ul>\n<ul>\n<li>2+ years of experience working with modern data transformation tools like DBT.</li>\n</ul>\n<ul>\n<li>Advanced knowledge of databases and SQL with the 
ability to efficiently stage, process, and transform data.</li>\n</ul>\n<ul>\n<li>Experience integrating and orchestrating data workflows with various modern data tools and systems.</li>\n</ul>\n<ul>\n<li>Experience with data modeling, ETL/ELT processes, and data warehousing solutions.</li>\n</ul>\n<ul>\n<li>Experience working with a data warehouse such as Snowflake.</li>\n</ul>\n<ul>\n<li>Experience with a data workflow orchestrator tool such as Airflow.</li>\n</ul>\n<ul>\n<li>Experience with a programming language such as Python.</li>\n</ul>\n<ul>\n<li>Familiarity with BI tools such as Looker, Tableau, or similar platforms is a plus.</li>\n</ul>\n<ul>\n<li>Exceptional quantitative and analytical skills.</li>\n</ul>\n<ul>\n<li>Strong communication skills and ability to collaborate with various stakeholders, both technical and non-technical.</li>\n</ul>\n<p>Compensation:</p>\n<p>The expected salary range for this role is $120,800 - $151,000. However, the starting base pay will depend on a number of factors including the candidate’s location, skills, experience, market demands, and internal pay parity. 
Depending on the position offered, equity and other forms of compensation may be provided as part of a total compensation package.</p>","url":"https://yubhub.co/jobs/job_1d204fa1-067","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Brex","sameAs":"https://brex.com/","logo":"https://logos.yubhub.co/brex.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/brex/jobs/8510493002","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$120,800 - $151,000","x-skills-required":["DBT","databases","SQL","data modeling","ETL/ELT processes","data warehousing solutions","Snowflake","Airflow","Python","BI tools","Looker","Tableau"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:02.393Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Seattle, Washington, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Finance","skills":"DBT, databases, SQL, data modeling, ETL/ELT processes, data warehousing solutions, Snowflake, Airflow, Python, BI tools, Looker, Tableau","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120800,"maxValue":151000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_651a6835-81f"},"title":"Member of Revenue Strategy & Operations, Marketing Analytics","description":"<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto. 
We are looking for a high-impact Marketing Analytics &amp; Operations Manager to architect and scale our marketing data ecosystem.</p>\n<p>This role goes beyond reporting; you will be responsible for building the foundation that powers our go-to-market strategy, from attribution modeling to executive decision-making. You will partner closely with marketing, sales, and revenue leadership to connect marketing efforts to business outcomes, enabling smarter investments and accelerating growth in the institutional crypto ecosystem.</p>\n<p><strong>Core Competencies</strong></p>\n<ul>\n<li>Advanced SQL proficiency: Ability to extract, transform, and analyze large datasets across multiple systems to generate insights</li>\n<li>Salesforce expertise (2–3+ years): Hands-on experience as a Salesforce admin supporting marketing and GTM teams, including campaign tracking, attribution, and data architecture</li>\n<li>Marketing data architecture ownership: Experience integrating and scaling data across tools (e.g., CRM, marketing automation, analytics platforms)</li>\n<li>Analytical rigor + creative problem-solving: Strong critical thinker who can challenge assumptions, identify root causes, and design innovative solutions</li>\n<li>Cross-functional leadership: Proven ability to collaborate across marketing, sales, RevOps, and product teams to drive aligned outcomes</li>\n</ul>\n<p><strong>Technical Skills:</strong></p>\n<p>Technical &amp; Analytical Ownership</p>\n<ul>\n<li>Own and evolve marketing data infrastructure, including Salesforce and integrated MarTech systems</li>\n<li>Write and optimize SQL queries to analyze campaign performance, pipeline generation, and revenue attribution</li>\n<li>Build and maintain scalable dashboards and reporting frameworks across:</li>\n<li>Performance marketing</li>\n<li>BDR outbound</li>\n<li>Field &amp; event marketing</li>\n<li>Develop multi-touch attribution models and conversion tracking methodologies</li>\n<li>Analyze RoAS, LTV/CAC, and 
funnel efficiency to inform investment decisions</li>\n</ul>\n<p><strong>Complexity and Impact of Work:</strong></p>\n<p>Systems &amp; Data Architecture</p>\n<ul>\n<li>Serve as Salesforce admin for marketing, ensuring clean data structures, campaign tracking, and attribution integrity</li>\n<li>Assess and optimize the marketing tech stack, identifying gaps and opportunities for automation and scale</li>\n<li>Partner closely with Data teams to ensure data consistency and governance across systems</li>\n</ul>\n<p><strong>Organizational Knowledge:</strong></p>\n<ul>\n<li>Produce executive-ready dashboards, reports, and presentations that clearly communicate marketing performance and ROI</li>\n<li>Provide actionable insights to support budget allocation, channel strategy, and campaign prioritization</li>\n<li>Conduct deep-dive analyses into client journeys, segmentation, and growth drivers</li>\n</ul>\n<p><strong>Communication and Influence:</strong></p>\n<ul>\n<li>Act as a key partner to Marketing, Sales, BDR, and Relationship Management teams</li>\n<li>Translate business needs into data requirements and technical solutions</li>\n<li>Influence stakeholders through clear communication, strong storytelling, and data-backed recommendations</li>\n</ul>\n<p><strong>What Sets You Apart:</strong></p>\n<ul>\n<li>Deep understanding of the institutional crypto landscape and client lifecycle</li>\n<li>Experience evaluating and implementing MarTech tools and integrations</li>\n<li>Ability to move fluidly between technical execution and strategic thinking</li>\n<li>Strong storytelling skills, turning data into clear, compelling narratives for executives</li>\n</ul>\n<p><strong>You may be a fit for this role if you have:</strong></p>\n<ul>\n<li>2–3+ years of Salesforce experience supporting marketing or GTM teams (admin-level ownership preferred)</li>\n<li>Strong hands-on experience with SQL and data analysis (required)</li>\n<li>Experience building marketing dashboards, attribution 
models, and performance reporting systems</li>\n<li>Familiarity with institutional crypto, fintech, or financial services and how GTM operates in these environments</li>\n<li>A track record of building or improving data infrastructure and analytics frameworks</li>\n<li>A scrappy, ownership-driven mindset, comfortable wearing multiple hats and operating in a fast-paced environment</li>\n</ul>\n<p><strong>Although not a requirement, bonus points if:</strong></p>\n<ul>\n<li>You have experience with data warehousing solutions (e.g., Snowflake, BigQuery, Redshift)</li>\n<li>You have exposure to BI tools (e.g., Looker, Tableau, Hex)</li>\n<li>You were emotionally moved by the soundtrack to Hamilton, which chronicles the founding of a new financial system. :)</li>\n</ul>","url":"https://yubhub.co/jobs/job_651a6835-81f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anchorage Digital","sameAs":"https://anchorage.com","logo":"https://logos.yubhub.co/anchorage.com.png"},"x-apply-url":"https://jobs.lever.co/anchorage/0a8757c6-e3e9-42d5-aa15-9b779f2e8c16","x-work-arrangement":"remote","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Advanced SQL proficiency","Salesforce expertise","Marketing data architecture ownership","Analytical rigor + creative problem-solving","Cross-functional leadership"],"x-skills-preferred":["Data warehousing solutions","BI tools","Institutional crypto landscape and client lifecycle","MarTech tools and integrations","Strong storytelling skills"],"datePosted":"2026-04-17T12:23:17.998Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Marketing","industry":"Finance","skills":"Advanced SQL proficiency, Salesforce expertise, Marketing 
data architecture ownership, Analytical rigor + creative problem-solving, Cross-functional leadership, Data warehousing solutions, BI tools, Institutional crypto landscape and client lifecycle, MarTech tools and integrations, Strong storytelling skills"}]}