The movie Toy Story needed top-of-the-line computers in 1995 to render every frame, and that took a lot of time (800,000 machine-hours according to Wikipedia).

Would it be possible to render it in real time with modern (2025) GPUs on a single home computer?
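
For a rough sense of scale (the runtime and frame rate below are my assumptions, not part of the Wikipedia figure), spreading those 800,000 machine-hours over the film's frames comes out to several machine-hours per frame, versus the ~42 ms per frame that real time allows:

# Back-of-the-envelope numbers; runtime and frame rate are assumptions.
RUNTIME_MIN = 81              # Toy Story runs roughly 81 minutes
FPS = 24                      # film frame rate
MACHINE_HOURS = 800_000       # figure quoted above from Wikipedia

frames = RUNTIME_MIN * 60 * FPS                      # ~116,640 frames
hours_per_frame = MACHINE_HOURS / frames             # ~6.9 machine-hours per frame
realtime_budget_s = 1 / FPS                          # ~0.042 s per frame in real time
speedup = hours_per_frame * 3600 / realtime_budget_s

print(f"{frames} frames, {hours_per_frame:.1f} machine-hours each")
print(f"~{speedup:,.0f}x faster than one 1995 machine to hit real time")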

  • Emily (she/her)@lemmy.blahaj.zone

    I’m not a computer graphics expert (though I have at least a little experience with video game dev), but considering Toy Story uses ray-traced lighting, I would say it at least depends on whether you have a ray-tracing-capable GPU. If you don’t, probably not. Otherwise, I would guess you could get something at least pretty close out of a modern game engine.

      • Blackmist@feddit.uk

        Full ray tracing (path tracing) might be possible.

        I think they can do it for very basic-looking games like Quake 2.

        That said, I doubt you’d actually need full RT for visuals like Toy Story 1’s. Or indeed for most things.

        They got pretty good at faking most of it. RT can basically just be used for reflections, shadows, and global illumination, and most of us wouldn’t notice the difference.
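
        To illustrate, a hybrid frame might be structured roughly like this (purely a sketch; none of these function names come from a real engine):

        # Illustrative frame loop for the hybrid idea above; all names are hypothetical.
        def rasterize(scene, camera):
            # cheap raster pass: per-pixel depth, normals, albedo (a "G-buffer")
            return {"albedo": ..., "depth": ..., "normal": ...}

        def trace_rays(scene, gbuffer, effect):
            # the only ray-traced work: one targeted pass per effect
            return {"effect": effect}

        def composite(gbuffer, *rt_passes):
            # combine the rasterized base image with the ray-traced layers
            return (gbuffer, rt_passes)

        def render_frame(scene, camera):
            gbuffer = rasterize(scene, camera)                   # most of the image
            shadows = trace_rays(scene, gbuffer, "shadows")
            reflections = trace_rays(scene, gbuffer, "reflections")
            gi = trace_rays(scene, gbuffer, "gi")                # e.g. 1-bounce diffuse GI
            return composite(gbuffer, shadows, reflections, gi)

        frame = render_frame(scene={}, camera={})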

      • Emily (she/her)@lemmy.blahaj.zone

        Maybe; what I said is admittedly based mostly on my experience with Blender’s Cycles renderer, which is definitely not real time.

    • magic_lobster_party@fedia.io

      Did Toy Story use ray tracing back then?

      AFAIK, A Bug’s Life is the first Pixar movie that used ray tracing to some extent, and that was for a few reflections. Monsters University is the first Pixar movie that was fully ray traced.

      • Emily (she/her)@lemmy.blahaj.zone

        You’re right, it looks like they didn’t (at least for most things?). The paper does mention ray tracing briefly, saying the sampling stage can “combine point samples from this algorithm with point samples from other algorithms that have capabilities such as ray tracing”, but for shadows and refractions it seems to describe something like shadow mapping and regular raster shading techniques (“textures have also been used for refractions and shadows”).
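
        For anyone curious, the shadow-map idea works roughly like this (a toy sketch of the general technique, not anything taken from the paper):

        import numpy as np

        # Shadow mapping in two steps: (1) render depth from the light's point of
        # view into a "shadow map"; (2) when shading a surface point, project it
        # into the light's view and compare depths. All values below are made up.

        def in_shadow(x, y, depth_from_light, shadow_map, bias=1e-3):
            stored = shadow_map[y, x]                 # closest surface the light sees there
            return depth_from_light - bias > stored   # something nearer blocks the light

        # Toy shadow map: the light sees a floor at depth 10 with a box at depth 4.
        shadow_map = np.full((64, 64), 10.0)
        shadow_map[20:40, 20:40] = 4.0                # the occluder

        print(in_shadow(30, 30, 10.0, shadow_map))    # floor point behind the box -> True
        print(in_shadow(5, 5, 10.0, shadow_map))      # floor point in the open    -> False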

        • magic_lobster_party@fedia.io

          Interesting paper. I skimmed through it quickly, but it seems like they wanted to avoid relying on ray tracing.

          Minimal ray tracing. Many non-local lighting effects can be approximated with texture maps. Few objects in natural scenes would seem to require ray tracing. Accordingly, we consider it more important to optimize the architecture for complex geometries and large models than for the non-local lighting effects accounted for by ray tracing or radiosity.

          Most of the paper is way above my understanding, so I’m not qualified.

      • CodexArcanum@lemmy.dbzer0.com

        Physically Based Rendering (the freely available book) won its authors a special Academy Award in 2014. That book is still the teaching standard for ray tracing so far as I know. In the intro, they discuss Pixar adding ray tracing (based on pbrt) to their RenderMan software in the early 2000s.

        A Bug’s Life and TS2 could have benefited from some of that, but I’d guess Monsters Inc was the first full outing for it, and certainly by Nemo they must have been doing mostly ray tracing.