<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>19ef76c6-d81</externalid>
      <Title>Research Engineer / Scientist (3D Reconstruction)</Title>
      <Description><![CDATA[<p>We&#39;re looking for a 3D Reconstruction Specialist to develop and advance state-of-the-art methods for reconstructing high-quality 3D geometry and appearance from real-world data. This role is focused on modern reconstruction techniques, both feed-forward and optimization-based, with an emphasis on novel representations, robust optimization, and scalable training and inference pipelines.</p>
<p>This is a hands-on, research-driven role for someone who enjoys working at the intersection of computer vision, graphics, and machine learning. You&#39;ll collaborate closely with research scientists, ML engineers, and product teams to translate cutting-edge reconstruction ideas into production-ready systems that power core product capabilities.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Design and implement modern 3D reconstruction systems, including feed-forward and optimization-based approaches for geometry, appearance, and scene understanding.</li>
<li>Research, prototype, and productionize advanced 3D representations (e.g., implicit functions, point-based or volumetric methods, hybrid representations) with a focus on accuracy, efficiency, and scalability.</li>
<li>Develop and improve optimization pipelines for multi-view reconstruction, including camera pose estimation, joint geometry/appearance optimization, and robust loss formulations.</li>
<li>Build end-to-end training and evaluation workflows for 3D reconstruction models, from data preparation and supervision strategies to large-scale experiments and metrics.</li>
<li>Collaborate with data and infrastructure teams to ensure reconstruction methods integrate cleanly with existing 3D data pipelines, rendering systems, and downstream applications.</li>
<li>Analyze failure modes and data quality issues in real-world reconstruction scenarios, and design principled solutions to improve robustness and generalization.</li>
<li>Optimize performance across the stack, including memory usage, training speed, and inference latency, to support large-scale datasets and production constraints.</li>
<li>Contribute to technical direction by proposing new research ideas, mentoring teammates, and helping set best practices for 3D reconstruction across the organization.</li>
</ul>
<p><strong>Key Qualifications:</strong></p>
<ul>
<li>6+ years of experience working on 3D reconstruction, multi-view geometry, or related areas in computer vision, graphics, or machine learning.</li>
<li>Strong foundation in modern 3D reconstruction techniques, including feed-forward neural methods or optimization-based approaches.</li>
<li>Deep experience with 3D representations and their tradeoffs (e.g., implicit fields, point-based methods, meshes, volumes) or with large-scale optimization pipelines for reconstruction.</li>
<li>Proficiency in Python and/or C++, with hands-on experience building research or production systems.</li>
<li>Experience with deep learning frameworks (e.g., PyTorch) and numerical optimization tools.</li>
<li>Familiarity with rendering, differentiable rendering, or graphics pipelines, and how they interact with reconstruction systems.</li>
<li>Proven ability to work in ambiguous, fast-moving environments and drive projects from concept through deployment.</li>
<li>A strong sense of ownership and scientific rigor: you care deeply about correctness, reproducibility, and measurable improvements.</li>
<li>Enthusiasm for collaborating with a small, high-caliber team and for raising the technical bar through thoughtful design, experimentation, and code quality.</li>
</ul>
<p><strong>Who You Are:</strong></p>
<ul>
<li>Fearless Innovator: We need people who thrive on challenges and aren&#39;t afraid to tackle the impossible.</li>
<li>Resilient Builder: Building Large World Models isn&#39;t a sprint; it&#39;s a marathon with hurdles. We&#39;re looking for builders who can weather the storms of groundbreaking research and come out stronger.</li>
<li>Mission-Driven Mindset: Everything we do is in service of creating the best spatially intelligent AI systems, and using them to empower people.</li>
<li>Collaborative Spirit: We&#39;re building something bigger than any one person. We need team players who can harness the power of collective intelligence.</li>
</ul>
<p>We&#39;re hiring the brightest minds from around the globe to bring diverse perspectives to our cutting-edge work. If you&#39;re ready to work on technology that will reshape how machines perceive and interact with the world, World Labs is your launchpad.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$250,000-$350,000 base salary (good-faith estimate for San Francisco Bay Area upon hire; actual offer based on experience, skills, and qualifications)</Salaryrange>
      <Skills>3D reconstruction, multi-view geometry, computer vision, graphics, machine learning, Python, C++, PyTorch, numerical optimization tools, rendering, differentiable rendering, graphics pipelines</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>World Labs</Employername>
      <Employerlogo>https://logos.yubhub.co/worldlabs.ai.png</Employerlogo>
      <Employerdescription>World Labs builds foundational world models that can perceive, generate, reason, and interact with the 3D world.</Employerdescription>
      <Employerwebsite>https://worldlabs.ai</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>250000</Compensationmin>
      <Compensationmax>350000</Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/worldlabs/jobs/4113005009</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
  </jobs>
</source>