{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/data-mart"},"x-facet":{"type":"skill","slug":"data-mart","display":"Data Mart","count":4},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_eca46b5f-481"},"title":"IT Service Management Analyst","description":"<p>The IT Service Management Analyst is responsible for the strategy development, operation and support of enterprise-wide service management functions, including asset, incident, change, problem, service request, knowledge, and configuration management processes and operational analytics/BI/reporting functions. This position is responsible for developing and supporting TO metrics and reporting to monitor process/tool adoption and maturity levels, technical ability to pull data from a variety of sources, consolidating and analyzing the data to identify potential areas for improvement, leading projects to identify and execute improvements in processes and tools, and working with service owners and leaders across the technology organization to identify and publish actionable service metrics that drive continuous service improvement.</p>\n<p>This role requires a strong analytical and problem-solving skillset, with the ability to work with multiple stakeholders and prioritize tasks effectively. 
The ideal candidate will have a strong understanding of IT service management principles and practices, as well as experience with data analysis and reporting tools such as PowerBI and Excel.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design and develop metrics and reports using data from Helix, and other data sources</li>\n<li>Assist Service Management Analysts and technology partners in defining requirements for metrics and reports</li>\n<li>Support and enhance existing metrics and reports</li>\n<li>Support and enhance the Helix (SQL server) data mart</li>\n<li>Act as a liaison to the Data and Analytics reporting team for support and requirements</li>\n<li>Analyze data for trends, root cause, and performance issues</li>\n<li>Validate data accuracy</li>\n<li>Identify and troubleshoot data issues</li>\n<li>Work with vendors as appropriate to resolve data issues or discuss requirements</li>\n<li>Identify metrics and reports that will help improve TO processes</li>\n<li>Track and report status of metrics and reporting projects</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_eca46b5f-481","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Georgia Power","sameAs":"https://www.georgiapower.com/","logo":"https://logos.yubhub.co/georgiapower.com.png"},"x-apply-url":"https://emje.fa.us6.oraclecloud.com/hcmUI/CandidateExperience/en/sites/SouthernCompanyJobs/job/17795","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Advanced T-SQL","Postgres","Oracle","PowerBI","Data Analysis Expressions (DAX)","Excel data analysis","pivot tables","slicers","charting","data analysis","VBA macros","CMS","relational databases","BMC Helix","ITIL Foundation certification"],"x-skills-preferred":["Business Intelligence analytics tools","data marts","data mining 
skills","ITIL","IT Service Management frameworks","process improvement"],"datePosted":"2026-04-03T08:37:03.241Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Atlanta"}},"employmentType":"FULL_TIME","occupationalCategory":"IT","industry":"Energy","skills":"Advanced T-SQL, Postgres, Oracle, PowerBI, Data Analysis Expressions (DAX), Excel data analysis, pivot tables, slicers, charting, data analysis, VBA macros, CMS, relational databases, BMC Helix, ITIL Foundation certification, Business Intelligence analytics tools, data marts, data mining skills, ITIL, IT Service Management frameworks, process improvement"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_91afbbda-8cd"},"title":"Backend Engineer, Data","description":"<p>We&#39;re looking for a talented Backend Engineer, Data to join our Data Foundations team. As a Backend Engineer, Data, you will design, develop, and own data pipelines, models, and products that power the Product, Data Science, and GTM functions. You will work with a variety of internal teams across Product, Data Science, and GTM to help them solve their data needs. 
Your work will provide visibility into how these stakeholders and the Data Foundations organization are performing and how we can deliver a better experience to Stripe&#39;s customers.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design, develop, and own data pipelines, models, and products that power the Product, Data Science, and GTM functions</li>\n<li>Develop strong subject matter expertise and manage the SLAs for both data pipelines and full stack web applications that support these critical stakeholders</li>\n<li>Build and refine Stripe&#39;s data foundations - infrastructure, pipelines, and tools to enable various teams at Stripe - working with Scala, Spark, and Airflow</li>\n<li>Leverage LLM and Agents at scale to produce high-quality data on ambiguous problems</li>\n<li>Refine our existing data marts that help the GTM organization forecast the future potential performance of the business and reliably measure ongoing attainment toward targets</li>\n<li>Build data services that track key product metrics and measure the impact of different strategies employed by teams in the field</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Must have 6+ years of experience in a Software Engineering role, with a focus on building and maintaining data services, or data-intensive applications</li>\n<li>A strong engineering background and are interested in data</li>\n<li>Prior experience with writing and debugging data pipelines using a distributed data framework (Spark / Hadoop / Pig etc)</li>\n<li>An inquisitive nature in diving into data inconsistencies to pinpoint issues, and resolve deep rooted data quality issues</li>\n<li>Knowledge of a backend development language (such as Scala, Java, or Go) and strong SQL experience</li>\n<li>The ability to communicate cross-functionally, derive requirements and architect shared datasets</li>\n</ul>\n<p>Preferred Requirements:</p>\n<ul>\n<li>Experience creating and maintaining Data Marts to power business reporting 
needs</li>\n<li>Experience working with Product or GTM (Sales/Marketing) teams</li>\n</ul>","url":"https://yubhub.co/jobs/job_91afbbda-8cd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Stripe","sameAs":"https://stripe.com/","logo":"https://logos.yubhub.co/stripe.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/stripe/jobs/6865161","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Scala","Spark","Airflow","LLM","Agents","SQL","Java","Go"],"x-skills-preferred":["Data Marts","Product","GTM","Sales","Marketing"],"datePosted":"2026-03-31T18:01:28.144Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Canada"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Scala, Spark, Airflow, LLM, Agents, SQL, Java, Go, Data Marts, Product, GTM, Sales, Marketing"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_aa015612-5ff"},"title":"Product & Solutions Lead, Safety and Security","description":"<p><strong>Job Posting</strong></p>\n<p><strong>Product &amp; Solutions Lead, Safety and Security</strong></p>\n<p><strong>Location</strong></p>\n<p>San Francisco</p>\n<p><strong>Employment Type</strong></p>\n<p>Full time</p>\n<p><strong>Department</strong></p>\n<p>Intelligence &amp; Investigations</p>\n<p><strong>Compensation</strong></p>\n<ul>\n<li>$288K – $425K • Offers Equity</li>\n</ul>\n<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. 
In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>\n<ul>\n<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>\n</ul>\n<ul>\n<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>\n</ul>\n<ul>\n<li>401(k) retirement plan with employer match</li>\n</ul>\n<ul>\n<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>\n</ul>\n<ul>\n<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>\n</ul>\n<ul>\n<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>\n</ul>\n<ul>\n<li>Mental health and wellness support</li>\n</ul>\n<ul>\n<li>Employer-paid basic life and disability coverage</li>\n</ul>\n<ul>\n<li>Annual learning and development stipend to fuel your professional growth</li>\n</ul>\n<ul>\n<li>Daily meals in our offices, and meal delivery credits as eligible</li>\n</ul>\n<ul>\n<li>Relocation support for eligible employees</li>\n</ul>\n<ul>\n<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>\n</ul>\n<p>More details about our benefits are available to candidates during the hiring process.</p>\n<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components at any time based on individual performance, team or company results, or market conditions.</p>\n<p><strong>About the Team</strong></p>\n<p>The Intelligence &amp; Investigations (I2) team detects and disrupts 
abuse and strategic risks so people can use AI safely. We translate real-world signals, investigations, and external threat intelligence into practical mitigations, operating guidance, and partner-ready support that improves safety outcomes across the AI ecosystem.</p>\n<p><strong>About the Role</strong></p>\n<p>As a Product &amp; Solutions Lead focused on safety and security, you will build and operate 0–1 products, services, and technical solution packages that help developers and public institutions move from experimentation to durable, trusted outcomes—while maintaining public safety, transparency, and respect for privacy and rights.</p>\n<p>This role balances two modes of delivery:</p>\n<ol>\n<li>Bespoke products and technical solutions for strategic internal and external partners, and</li>\n</ol>\n<ol>\n<li>Scalable product and solution packages that can be reused broadly across partners and deployments.</li>\n</ol>\n<p>Training is a component of scale, but not the center of gravity. You will also ship reference implementations, playbooks, evaluation kits, and repeatable operating models that partners can adopt and operate.</p>\n<p>You will work directly with engineers and a multidisciplinary group of safety and geopolitical analysts, and data and quantitative scientists to convert complex, evolving challenges into solutions that teams can adopt in high-stakes environments.</p>\n<p>This role is based in San Francisco, CA (hybrid, 3 days/week). 
Relocation support is available.</p>\n<p><strong>In this role, you will:</strong></p>\n<ul>\n<li>Own the 0–1 roadmap for safety and security solution offerings: define the target users, problem statements, tools, operating models, success metrics, and the set of reusable deliverables we ship.</li>\n</ul>\n<ul>\n<li>Design and ship bespoke technical solutions for priority partners (internal and external), then abstract what works into reusable patterns and toolkits.</li>\n</ul>\n<ul>\n<li>Build partner-ready technical artifacts: solution blueprints, reference architectures, evaluation and monitoring guidance, incident/response playbooks, and deployment checklists.</li>\n</ul>\n<ul>\n<li>Package open-source and proprietary capabilities into adoption-ready solutions (e.g., reference implementations, configuration patterns, validated workflows).</li>\n</ul>\n<ul>\n<li>Maintain a consistent delivery model across engagements: intake, scoping, governance alignment, execution cadence, and retrospectives that improve the offering over time.</li>\n</ul>\n<ul>\n<li>Translate evolving threats into actionable guidance and updates for solution packages (e.g., scams/fraud patterns, cyber-enabled threats, ecosystem abuse trends).</li>\n</ul>\n<ul>\n<li>Develop lightweight enablement components as needed: targeted technical modules, hands-on labs, and readiness assessments that accelerate adoption of the solutions.</li>\n</ul>\n<ul>\n<li>Define and instrument impact measurement: adoption milestones, readiness indicators, reliability and safety posture improvements, and partner satisfaction with outputs.</li>\n</ul>\n<ul>\n<li>Partner closely across engineering, safety, geopolitical analysis, and quantitative teams to ensure solutions are technically credible, threat-informed, and measurable.</li>\n</ul>\n<ul>\n<li>Communicate crisply and decision-readily to internal and external stakeholders: progress, trade-offs, risks, and recommendations.</li>\n</ul>\n<p><strong>You might thrive 
in this role if you:</strong></p>\n<ul>\n<li>Have 6+ years in product, technical program leadership, solutions, or platform operations, especially in safety, security, risk, integrity, or enterprise/public-sector contexts.</li>\n</ul>\n<ul>\n<li>Have built 0–1 solution offerings (product plus services or productized services): taking ambiguous needs, shipping something concrete, then scaling it into a repeatable model.</li>\n</ul>\n<ul>\n<li>Have a builder’s mindset: comfortable incubating early-stage ideas, testing them with partners, and evolving them into durable, repeatable safety and security solutions.</li>\n</ul>\n<ul>\n<li>Can go deep with engineers and still produce partner-ready artifacts that are clear</li>\n</ul>","url":"https://yubhub.co/jobs/job_aa015612-5ff","directApply":true,"hiringOrganization":{"@type":"Organization","name":"OpenAI","sameAs":"https://jobs.ashbyhq.com","logo":"https://logos.yubhub.co/openai.com.png"},"x-apply-url":"https://jobs.ashbyhq.com/openai/c664cc09-d996-450c-8683-ad591ac27c11","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$288K – $425K","x-skills-required":["product leadership","technical program leadership","solutions","platform operations","safety","security","risk","integrity","enterprise/public-sector contexts","product development","solution development","technical writing","communication","project management","team leadership","collaboration","problem-solving","analytical skills","data analysis","data visualization","machine learning","artificial intelligence","cybersecurity","threat intelligence","incident response","compliance","regulatory affairs"],"x-skills-preferred":["cloud computing","containerization","DevOps","agile development","scrum","kanban","continuous integration","continuous deployment","continuous testing","test 
automation","security testing","penetration testing","vulnerability assessment","compliance testing","regulatory testing","data protection","information security","cybersecurity frameworks","risk management","compliance management","regulatory compliance","data governance","information governance","data quality","data integrity","data validation","data verification","data certification","data assurance","data security","data encryption","data masking","data tokenization","data anonymization","data pseudonymization","data aggregation","data fusion","data integration","data warehousing","data mart","data lake","data catalog"],"datePosted":"2026-03-06T18:42:25.322Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"product leadership, technical program leadership, solutions, platform operations, safety, security, risk, integrity, enterprise/public-sector contexts, product development, solution development, technical writing, communication, project management, team leadership, collaboration, problem-solving, analytical skills, data analysis, data visualization, machine learning, artificial intelligence, cybersecurity, threat intelligence, incident response, compliance, regulatory affairs, cloud computing, containerization, DevOps, agile development, scrum, kanban, continuous integration, continuous deployment, continuous testing, test automation, security testing, penetration testing, vulnerability assessment, compliance testing, regulatory testing, data protection, 
information security, cybersecurity frameworks, risk management, compliance management, regulatory compliance, data governance, information governance, data quality, data integrity, data validation, data verification, data certification, data assurance, data security, data encryption, data masking, data tokenization, data anonymization, data pseudonymization, data aggregation, data fusion, data integration, data warehousing, data mart, data lake, data catalog","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":288000,"maxValue":425000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_448a56f3-ab5"},"title":"Director of Data Engineering and Agentic AI Automation, Finance","description":"<p><strong>Director of Data Engineering and Agentic AI Automation, Finance</strong></p>\n<p><strong>Location</strong></p>\n<p>San Francisco</p>\n<p><strong>Employment Type</strong></p>\n<p>Full time</p>\n<p><strong>Department</strong></p>\n<p>Finance</p>\n<p><strong>Compensation</strong></p>\n<ul>\n<li>$347K – $490K • Offers Equity</li>\n</ul>\n<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. 
In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>\n</ul>\n<ul>\n<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>\n</ul>\n<ul>\n<li>401(k) retirement plan with employer match</li>\n</ul>\n<ul>\n<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>\n</ul>\n<ul>\n<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>\n</ul>\n<ul>\n<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>\n</ul>\n<ul>\n<li>Mental health and wellness support</li>\n</ul>\n<ul>\n<li>Employer-paid basic life and disability coverage</li>\n</ul>\n<ul>\n<li>Annual learning and development stipend to fuel your professional growth</li>\n</ul>\n<ul>\n<li>Daily meals in our offices, and meal delivery credits as eligible</li>\n</ul>\n<ul>\n<li>Relocation support for eligible employees</li>\n</ul>\n<ul>\n<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>\n</ul>\n<p><strong>About the Team</strong></p>\n<p>We are looking for a Director of Data Engineering and Agentic AI Automation to lead the next generation of our finance data infrastructure. As OpenAI expands its Finance operations, we need scalable and trustworthy data systems to match the pace and complexity of our growth. 
This includes well-modeled, auditable data for revenue recognition, financial reporting, and planning, supported by reliable pipelines that connect ERP, planning, and operational systems. You will lead a group of analytics engineers, data engineers, and AI engineers to build the data pipelines that connect our internal engineering systems with enterprise platforms such as Oracle Fusion ERP. This role will also define the roadmap for agentic AI automation, enabling intelligent workflows, process automation, and AI-driven decision-making across Finance.</p>\n<p><strong>In this role, you will:</strong></p>\n<ul>\n<li>Build and maintain scalable, auditable data infrastructure that powers accurate financial information, with a focus on revenue recognition, compute attribution, and close automation.</li>\n</ul>\n<ul>\n<li>Lead and grow teams of analytics engineers, data engineers, and AI engineers to deliver high-impact, intelligent data systems.</li>\n</ul>\n<ul>\n<li>Guide work across financial close and allocations automation, B2C revenue automation from engineering systems to ERP (including reconciliation with cash and source systems), and other mission-critical financial processes.</li>\n</ul>\n<ul>\n<li>Design and implement data pipelines connecting ERP, planning, and operational systems, including Oracle Fusion, Anaplan, and Workday.</li>\n</ul>\n<ul>\n<li>Build and support scalable, audit-proof architecture that enables reliable financial reporting and compliance.</li>\n</ul>\n<ul>\n<li>Develop data and AI-powered workflows that enhance forecasting accuracy, compliance automation, and operational efficiency.</li>\n</ul>\n<ul>\n<li>Create and maintain data marts and products that support stakeholders across Revenue, FP&amp;A, Tax, Procurement, Hardware Accounting, and Controller teams.</li>\n</ul>\n<ul>\n<li>Define and enforce best practices for data modeling, lineage, observability, and reconciliation across finance data domains.</li>\n</ul>\n<ul>\n<li>Set the 
technical direction and manage team structure, mentoring engineers and overseeing contractors or system integrators to ensure delivery of high-quality outcomes.</li>\n</ul>\n<ul>\n<li>Partner with senior leaders across Finance, Engineering, and Infrastructure to align on priorities and integrate new automation capabilities.</li>\n</ul>\n<ul>\n<li>Ensure data systems are AI-ready and capable of supporting predictive analytics, autonomous agent workflows, and large-scale automation.</li>\n</ul>\n<ul>\n<li>Own and maintain Tier-1 data pipelines with strict SLA, data quality, and compliance standards.</li>\n</ul>\n<ul>\n<li>Drive the long-term roadmap for agentic AI enablement to build the foundation for “Finance on OpenAI.”</li>\n</ul>\n<p><strong>You might thrive in this role if you have:</strong></p>\n<ul>\n<li>12+ years in data engineering, with proven experience building and managing enterprise-scale, auditable ETL pipelines and complex datasets</li>\n</ul>\n<ul>\n<li>Proficiency in SQL and Python, with demonstrated experience in schema design, data modeling, and orchestration frameworks</li>\n</ul>\n<ul>\n<li>Expertise in distributed data processing technologies such as Apache Spark, Kafka, and cloud-native storage (e.g., S3, ADLS)</li>\n</ul>\n<ul>\n<li>Deep knowledge of enterprise data architecture, especially within Finance and Supply Chain</li>\n</ul>\n<ul>\n<li>Familiarity with financial processes (close, allocations, revenue recognition) and supply chain data models (Supply and demand planning, procurement, vendor master), along with experience in ingesting data from internal engineering systems with large volumes of B2C</li>\n</ul>\n<ul>\n<li>Experience integrating with contract manufacturers and external logistics providers is a strong plus</li>\n</ul>\n<ul>\n<li>Strong track record of partnering with senior business stakeholders</li>\n</ul>\n<p><strong>Work Environment</strong></p>\n<p>This role is based in San Francisco, CA. 
We use a hybrid work model of 3 days in the office per week and offer relocation assistance to new employees.</p>","url":"https://yubhub.co/jobs/job_448a56f3-ab5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"OpenAI","sameAs":"https://jobs.ashbyhq.com","logo":"https://logos.yubhub.co/openai.com.png"},"x-apply-url":"https://jobs.ashbyhq.com/openai/e84e7b7e-a82e-411e-929a-615dc3080280","x-work-arrangement":"hybrid","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$347K – $490K • Offers Equity","x-skills-required":["SQL","Python","Apache Spark","Kafka","cloud-native storage","data modeling","orchestration frameworks","distributed data processing technologies","enterprise data architecture","financial processes","supply chain data models"],"x-skills-preferred":["ETL pipelines","complex datasets","schema design","data engineering","data infrastructure","auditable data","revenue recognition","financial reporting","planning","ERP","operational systems","Oracle Fusion","Anaplan","Workday","data marts","products","stakeholders","Revenue","FP&A","Tax","Procurement","Hardware Accounting","Controller","data modeling","lineage","observability","reconciliation","finance data domains","team structure","engineers","contractors","system integrators","predictive analytics","autonomous agent workflows","large-scale automation"],"datePosted":"2026-03-06T18:27:50.931Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"SQL, Python, Apache Spark, Kafka, cloud-native storage, data modeling, orchestration frameworks, distributed data processing technologies, enterprise data architecture, financial processes, supply chain data models, ETL pipelines, 
complex datasets, schema design, data engineering, data infrastructure, auditable data, revenue recognition, financial reporting, planning, ERP, operational systems, Oracle Fusion, Anaplan, Workday, data marts, products, stakeholders, Revenue, FP&A, Tax, Procurement, Hardware Accounting, Controller, data modeling, lineage, observability, reconciliation, finance data domains, team structure, engineers, contractors, system integrators, predictive analytics, autonomous agent workflows, large-scale automation","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":347000,"maxValue":490000,"unitText":"YEAR"}}}]}