Toy Story was rendered in 1995 on a farm of high-end machines, and producing every frame took a long time (about 800,000 machine-hours in total, according to Wikipedia).
Could it be rendered in real time on a single home computer with modern (2025) GPUs?
Yes and no.
You could get close with a lot of tricks to downsample and compress, but at times even an RTX 5090 with 32 GB of VRAM is something like 1/64th of what you'd need for full fidelity.
So you could “do it” but it wouldn’t be “it”.
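For a sense of scale, here's a quick back-of-envelope sketch. The 800,000 machine-hours figure comes from the question above; the ~81-minute runtime and 24 fps film frame rate are assumptions made for the estimate.

```python
# Back-of-envelope: how much faster than a single 1995 render node
# would one machine need to be to render Toy Story in real time?

MACHINE_HOURS = 800_000   # total render time (from the question, per Wikipedia)
RUNTIME_MIN = 81          # assumed runtime of the film in minutes
FPS = 24                  # assumed standard film frame rate

frames = RUNTIME_MIN * 60 * FPS                # ~116,640 frames
hours_per_frame = MACHINE_HOURS / frames       # ~6.9 machine-hours per frame
realtime_budget = 1 / FPS                      # seconds allowed per frame

speedup = hours_per_frame * 3600 / realtime_budget
print(f"{frames:,} frames, {hours_per_frame:.1f} machine-hours each")
print(f"Required speedup over one 1995 node: ~{speedup:,.0f}x")
```

That works out to a required speedup of very roughly 600,000x over a single 1995 render node. A modern flagship GPU is, very roughly, five to six orders of magnitude ahead of a mid-90s workstation in raw FLOPS, so the arithmetic lands right on the edge, which is why the honest answer is "yes and no" rather than a flat no.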