The movie Toy Story needed top-of-the-line computers in 1995 to render every frame, and that took a lot of time (800,000 machine-hours, according to Wikipedia).

Would it be possible to render it in real time on a single home computer with modern (2025) GPUs?
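
For scale, here is a rough back-of-the-envelope sketch of what real time would demand, assuming the 800,000 machine-hour figure plus a roughly 81-minute runtime at 24 fps (the runtime and frame rate are my own assumptions, not part of the quoted figure):

```python
# Back-of-the-envelope: what "real time" would demand, given the
# 800,000 machine-hour render figure. Runtime (~81 min) and frame
# rate (24 fps) are assumptions, not part of the quoted figure.

RENDER_MACHINE_HOURS = 800_000   # reported total render time
RUNTIME_MINUTES = 81             # assumed theatrical runtime
FPS = 24                         # assumed frame rate

total_frames = RUNTIME_MINUTES * 60 * FPS               # ~116,640 frames
hours_per_frame = RENDER_MACHINE_HOURS / total_frames   # ~6.9 machine-hours
realtime_budget_s = 1 / FPS                             # ~41.7 ms per frame

speedup_needed = hours_per_frame * 3600 / realtime_budget_s
print(f"{total_frames:,} frames, ~{hours_per_frame:.1f} machine-hours each")
print(f"Real time allows {realtime_budget_s * 1000:.1f} ms per frame")
print(f"That is a ~{speedup_needed:,.0f}x speedup over one 1995 render node")
```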

  • BuelldozerA
    There is no comparison between a top-of-the-line SGI workstation from 1993-1995 and a gaming rig built in 2025. The 2025 Gaming Rig is literally orders of magnitude more powerful.

    In 1993 the very best that SGI could sell you was an Onyx RealityEngine2 that cost an eye-watering $250,000 in 1993 money ($553,000 today).

    A full spec breakdown would be boring and difficult, but the best you could do in a “deskside” configuration was 4 × single-core MIPS processors, either R4400s at 295 MHz or R10000s at 195 MHz, with something like 2GB of memory. The RE2 system could maybe pull 500 megaflops.

    A 2025 Gaming Rig can have a 12-core (or more) processor clocked at 5 GHz and 64GB of RAM. An Nvidia RTX 4060 is rated for roughly 15 teraflops of single-precision compute; the ~230 gigaflop number you sometimes see is its double-precision rating. (There's a rough FLOPS comparison sketched at the end of this comment.)

    A modern Gaming Rig absolutely, completely, and totally curb-stomps anything SGI could build in the early-to-mid '90s. The performance delta is so wide it’s difficult to adequately express. The way Pixar got it done was by having a whole bunch of SGI systems working together, but 30 years of advancements in hardware, software, and math have nearly, if not completely, erased even that advantage.

    If a single modern gaming rig can’t replace all of the Pixar SGI stations combined, it’s got to be very close.
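
    To put the FLOPS gap into numbers, a minimal sketch (the ~500 megaflops is the figure quoted above; the ~15 teraflops single-precision rating for the RTX 4060 is my approximation, not something from this thread):

    ```python
    # Rough FLOPS ratio for the comparison above. The RE2 number is the
    # ~500 megaflops quoted in this comment; the RTX 4060 number is an
    # approximate FP32 rating (~15 TFLOPS), used here as an assumption.

    ONYX_RE2_FLOPS = 500e6        # ~500 MFLOPS (Onyx RealityEngine2)
    RTX_4060_FP32_FLOPS = 15e12   # ~15 TFLOPS single precision (approx.)

    ratio = RTX_4060_FP32_FLOPS / ONYX_RE2_FLOPS
    print(f"One RTX 4060 is roughly {ratio:,.0f}x the raw FLOPS of an Onyx RE2")
    # => on the order of 30,000x, before counting CPU, memory, and 30 years
    #    of renderer and algorithm improvements.
    ```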