{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/governance-principles"},"x-facet":{"type":"skill","slug":"governance-principles","display":"Governance Principles","count":3},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3fa0b80f-842"},"title":"Staff Software Engineer, Public Sector","description":"<p>Job Title: Staff Software Engineer, Public Sector</p>\n<p>We are seeking a highly skilled Staff Software Engineer to join our Public Sector team. As a Staff Software Engineer, you will be responsible for designing and implementing software solutions for the public sector. 
You will work closely with cross-functional teams to develop and deploy software applications that meet the needs of government agencies.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Design and implement software solutions for the public sector</li>\n<li>Work closely with cross-functional teams to develop and deploy software applications</li>\n<li>Collaborate with stakeholders to understand their needs and develop software solutions that meet those needs</li>\n<li>Develop and maintain software documentation</li>\n<li>Participate in code reviews and ensure that code meets quality standards</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s degree in Computer Science or related field</li>\n<li>5+ years of experience in software development</li>\n<li>Proficiency in programming languages such as Java, Python, or C++</li>\n<li>Experience with Agile development methodologies</li>\n<li>Strong understanding of software design patterns and principles</li>\n<li>Excellent communication and collaboration skills</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Master&#39;s degree in Computer Science or related field</li>\n<li>10+ years of experience in software development</li>\n<li>Experience with cloud-based technologies such as AWS or Azure</li>\n<li>Experience with DevOps practices</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Competitive salary and benefits package</li>\n<li>Opportunities for professional growth and development</li>\n<li>Collaborative and dynamic work environment</li>\n</ul>\n<p>Salary Range: $252,000-$362,000 USD</p>\n<p>Required Skills:</p>\n<ul>\n<li>Full Stack Development</li>\n<li>Cloud-Native Technologies</li>\n<li>Data Engineering</li>\n<li>AI Application Integration</li>\n<li>Problem Solving</li>\n<li>Collaboration and Communication</li>\n<li>Adaptability and Learning Agility</li>\n</ul>\n<p>Preferred Skills:</p>\n<ul>\n<li>Experience with modern web development frameworks</li>\n<li>Familiarity with cloud platforms</li>\n<li>Understanding 
of containerization and container orchestration</li>\n<li>Knowledge of ETL processes</li>\n<li>Understanding of data modeling, data warehousing, and data governance principles</li>\n<li>Familiarity with integrating Large Language Models</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3fa0b80f-842","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale","sameAs":"https://www.scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4674913005","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$252,000-$362,000 USD","x-skills-required":["Full Stack Development","Cloud-Native Technologies","Data Engineering","AI Application Integration","Problem Solving","Collaboration and Communication","Adaptability and Learning Agility"],"x-skills-preferred":["Experience with modern web development frameworks","Familiarity with cloud platforms","Understanding of containerization and container orchestration","Knowledge of ETL processes","Understanding of data modeling, data warehousing, and data governance principles","Familiarity with integrating Large Language Models"],"datePosted":"2026-04-18T16:00:27.694Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, CA; St. 
Louis, MO; New York, NY; Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Full Stack Development, Cloud-Native Technologies, Data Engineering, AI Application Integration, Problem Solving, Collaboration and Communication, Adaptability and Learning Agility, Experience with modern web development frameworks, Familiarity with cloud platforms, Understanding of containerization and container orchestration, Knowledge of ETL processes, Understanding of data modeling, data warehousing, and data governance principles, Familiarity with integrating Large Language Models","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":252000,"maxValue":362000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_16d6fa31-6b8"},"title":"Senior IT Engineer, AI Enablement","description":"<p>Job Title: Senior IT Engineer, AI Enablement</p>\n<p>We are seeking a highly skilled Senior IT Engineer to join our team in enabling AI capabilities across the organization. As a Senior IT Engineer, AI Enablement, you will be responsible for building and expanding Omada&#39;s MCP ecosystem, connecting SaaS tools and internal systems via MCP servers, writing skills, composing tool bundles for different teams, and deploying them to the right people.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Serve as the primary point of contact for department champions across Omada, working with them to understand their day-to-day workflows and identify where AI-assisted automation creates real leverage.</li>\n<li>Drive AI adoption across the company by deploying tool bundles that are actually useful, making sure integrations fit how teams work rather than asking teams to adapt to what&#39;s technically convenient.</li>\n<li>Run a continuous feedback loop: gather usage signals and qualitative input from champions, identify what&#39;s working and what&#39;s falling flat, and iterate accordingly.</li>\n<li>Build lightweight documentation, reference examples, and enablement materials that help non-technical stakeholders understand what&#39;s possible and how to request new capabilities.</li>\n<li>Represent the needs of end users and champions when making integration decisions; you are their voice in the build process.</li>\n</ul>\n<p><strong>MCP Integration &amp; Build:</strong></p>\n<ul>\n<li>Integrate SaaS and internal applications with Omada&#39;s MCP ecosystem by building and configuring MCP servers, adding tools, and handling authentication patterns including OAuth and webhooks.</li>\n<li>Write skills that expose the right actions and data to AI agents in a clear, composable way.</li>\n<li>Compose skills and tools into role-appropriate bundles, scoped to what each team and function actually needs, not everything at once.</li>\n<li>Deploy tool bundles to ABAC groups, managing access so the right people get the right capabilities without overprovisioning.</li>\n<li>Partner with the Senior IT Engineers, Automation on integrations that span MCP and workflow automation, ensuring handoffs and shared patterns are consistent.</li>\n<li>Participate in design reviews for new MCP integrations to catch potential issues early and keep the ecosystem coherent.</li>\n<li>Teach and mentor IT team members as you go. Be the SMEs that help us understand and internalize this tech.</li>\n</ul>\n<p><strong>Governance &amp; Responsible Building:</strong></p>\n<ul>\n<li>Build MCP servers and skills that follow least-privilege principles from day one, scoping access to what an integration actually needs, and nothing more.</li>\n<li>Contribute to Omada&#39;s standards and policies for MCP server onboarding, skill review, and access governance, as a practitioner who cares about getting it right, not as a compliance gatekeeper.</li>\n<li>Ensure integrations handle data appropriately given Omada&#39;s health data environment. Understand what data flows where, flag concerns early, and work with Security and Compliance when review is warranted.</li>\n<li>Maintain audit-friendly integration configurations so that security and compliance teams have the visibility they need without heroic effort on their part.</li>\n<li>Collaborate with Security on risk assessment for high-sensitivity integrations, and translate security requirements into practical implementation decisions.</li>\n</ul>\n<p><strong>Platform Craft:</strong></p>\n<ul>\n<li>Maintain a working knowledge of the MCP control plane configuration and capabilities so you can ship integrations efficiently and troubleshoot confidently.</li>\n<li>Identify gaps in the current integration library and propose a prioritized roadmap for new MCP servers and skills, informed by champion feedback and team-level demand.</li>\n<li>Contribute to reusable patterns, shared templates, and internal documentation that raise the quality bar for everyone building on the platform.</li>\n<li>Stay current on the MCP ecosystem, agentic frameworks, and adjacent tooling. Bring relevant innovations back to the team.</li>\n</ul>\n<p><strong>What Great Looks Like:</strong></p>\n<ul>\n<li>Ships new MCP integrations and skill bundles regularly, moves from &quot;teams want this capability&quot; to &quot;teams are using this capability&quot; with speed and confidence.</li>\n<li>Earns trust with department champions by listening carefully, delivering on commitments, and iterating when something isn&#39;t quite right.</li>\n<li>Builds integrations that hold up over time: well-scoped permissions, thoughtful data handling, clear documentation. Don&#39;t just build POCs that work on day one.</li>\n<li>Operates autonomously but communicates proactively: stakeholders always know what&#39;s in progress, what&#39;s blocked, and what&#39;s coming next.</li>\n<li>Thinks about the whole adoption curve, not just the technical implementation. Considers onboarding, training, and feedback from the start.</li>\n<li>Demonstrates strong judgment about when to move fast and when to slow down and involve Security or Compliance.</li>\n<li>Influences how the team builds by contributing ideas, patterns, and standards that make the MCP ecosystem better for everyone who relies on it.</li>\n<li>Measures their own success by whether teams are actually using what was built, and digs in when adoption isn&#39;t happening.</li>\n</ul>\n<p><strong>Candidate Requirements:</strong></p>\n<ul>\n<li>Hands-on experience building with MCP; configuring or authoring MCP servers, connecting tools, writing skills, and working with agentic frameworks.</li>\n<li>Strong SaaS API integration experience: REST, webhooks, OAuth, and the practical realities of connecting enterprise applications reliably.</li>\n<li>Understanding of ABAC, access control, and governance principles for AI/LLM deployments, including how to apply least-privilege in practice.</li>\n<li>Expertise working directly with non-technical stakeholders, understanding their needs, translating them into technical solutions, and maintaining the relationship through iteration.</li>\n<li>Demonstrated track record of driving tool adoption, developer enablement, or similar change; not just building things, but getting people to use them.</li>\n<li>Ability to write and maintain integration code (Python or similar). Comfortable authoring and debugging scripts, not just configuring UIs.</li>\n<li>7+ years of experience in a systems integration, developer enablement, internal tooling, or closely related role.</li>\n<li>Strong communication skills; able to explain technical decisions and trade-offs clearly to audiences ranging from developers to non-technical stakeholders.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_16d6fa31-6b8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Omada Health","sameAs":"https://www.omadahealth.com/","logo":"https://logos.yubhub.co/omadahealth.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/omadahealth/jobs/7800365","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["MCP","SaaS API integration","REST","webhooks","OAuth","ABAC","access control","governance principles","least-privilege","agentic frameworks","Python","integration code","systems integration","developer enablement","internal tooling"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:50:31.497Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote, USA"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"MCP, SaaS API integration, REST, webhooks, OAuth, ABAC, access control, governance principles, least-privilege, agentic frameworks, Python, integration code, systems integration, developer enablement, internal 
tooling"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_56dc9a51-e66"},"title":"Principal Consultant - Data Architecture","description":"<p><strong>Principal Consultant - Data Architecture</strong></p>\n<p>You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset.</p>\n<p><strong>About Your Role</strong></p>\n<p>As a Principal Data Architecture Consultant, you will act as a senior technical leader in complex data and analytics engagements. You will shape and govern end-to-end enterprise data architectures, lead technical teams, and serve as a trusted technical advisor for clients and internal stakeholders.</p>\n<p><strong>Your Role Will Include:</strong></p>\n<ul>\n<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>\n<li>Translate business objectives into scalable, secure, and compliant data solutions</li>\n<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>\n<li>Guide delivery teams through implementation, rollout, and production readiness</li>\n<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>\n<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>\n<li>Support pre-sales and solution design activities from a technical perspective</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>\n<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>\n<li>Strong client-facing 
experience in complex enterprise environments</li>\n</ul>\n<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>\n<ul>\n<li>Strong expertise in modern data architectures, including:\n<ul>\n<li>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures</li>\n<li>Modern Data Architecture design principles</li>\n<li>Batch and streaming data integration patterns</li>\n<li>Data Platform, DevOps, deployment and security architectures</li>\n<li>Analytics and AI enablement architectures</li>\n</ul>\n</li>\n<li>Hands-on experience with cloud data platforms, e.g.:\n<ul>\n<li>Azure, AWS or GCP</li>\n<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>\n</ul>\n</li>\n<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>\n<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>\n<li>Solid understanding of API-based and event-driven architectures</li>\n<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>\n</ul>\n<p><strong>Engineering &amp; Platform Foundations</strong></p>\n<ul>\n<li>Experience with data pipelines, orchestration, and automation</li>\n<li>Familiarity with CI/CD concepts and production-grade deployments</li>\n<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>\n</ul>\n<p><strong>Data Management &amp; Governance</strong></p>\n<ul>\n<li>Strong understanding of data management and governance principles, including:\n<ul>\n<li>Data quality, metadata, lineage, master data management</li>\n<li>Data Management software and tools</li>\n<li>Security, access control, and compliance considerations</li>\n</ul>\n</li>\n<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>\n</ul>\n<p><strong>Nice to Have</strong></p>\n<ul>\n<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural 
perspective</li>\n<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>\n<li>Hands-on experience with data governance or metadata tools</li>\n<li>Cloud, data, or architecture certifications</li>\n</ul>\n<p><strong>Language &amp; Mobility</strong></p>\n<ul>\n<li>Very good English skills</li>\n<li>Willingness to travel for project-related work</li>\n</ul>\n<p><strong>Benefits</strong></p>\n<p>You will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>\n<p><strong>About Infosys Consulting</strong></p>\n<p>Be part of a globally renowned management consulting firm on the front line of industry disruption and at the cutting edge of technology. We work with market-leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>\n<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>\n<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity, and our dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us on their list of Best Firms to Work for. 
Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>\n<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you.... Apply today!</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_56dc9a51-e66","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Infosys Consulting - Europe","sameAs":"https://jobs.workable.com","logo":"https://logos.yubhub.co/view.com.png"},"x-apply-url":"https://jobs.workable.com/view/hpBWjvvy8D6B1f818cHxZR/remote-principal-consultant---data-architecture-in-poland-at-infosys-consulting---europe","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["enterprise data architecture","system data integration","data engineering","analytics","modern data architectures","Data Mesh/ Data Fabric/ Data lake / data warehouse architectures","Modern Data Architecture design principles","Batch and streaming data integration patterns","Data Platform, DevOps, deployment and security architectures","Analytics and AI enablement architectures","cloud data platforms","Azure","AWS","GCP","Databricks","Snowflake","BigQuery","Azure Synapse / Microsoft Fabric","SQL","relational databases","Postgres","SQL Server","Oracle","NoSQL databases","Cosmos DB","MongoDB","InfluxDB","API-based and event-driven architectures","data migration programmes","data pipelines","orchestration","automation","CI/CD concepts","production-grade deployments","distributed systems","Docker","Kubernetes","data management and governance principles","data 
quality","metadata","lineage","master data management","data management software and tools","security","access control","compliance considerations","Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience"],"x-skills-preferred":["advanced analytics","AI / ML or GenAI","streaming platforms","Kafka","Azure Event Hubs","data governance or metadata tools","cloud","data","architecture certifications"],"datePosted":"2026-03-09T16:51:22.857Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Poland"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"enterprise data architecture, system data integration, data engineering, analytics, modern data architectures, Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, cloud data platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, SQL, relational databases, Postgres, SQL Server, Oracle, NoSQL databases, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, data migration programmes, data pipelines, orchestration, automation, CI/CD concepts, production-grade deployments, distributed systems, Docker, Kubernetes, data management and governance principles, data quality, metadata, lineage, master data management, data management software and tools, security, access control, compliance considerations, Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience, advanced analytics, AI / ML or GenAI, streaming platforms, Kafka, Azure Event Hubs, data governance or metadata tools, cloud, data, architecture 
certifications"}]}