• Alexstarfire@lemmy.world · 1 day ago

    Isn’t vram usually bigger than ram? Those pics should be switched.

    EDIT: Oh, I took vram to be virtual ram, not video ram. It makes sense for video ram.

    • FlexibleToast@lemmy.world · 1 day ago

      Creating swap at 2x your RAM is outdated advice. The rule of thumb now is roughly 2x your RAM up to 4GB, then 1x up to 8GB, and anything over 8GB just gets 4GB of swap because you probably have enough RAM. Some modern systems like Fedora will even swap to zram instead, which is essentially a compressed swap device kept in RAM.
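      A rough sketch of that rule of thumb in Python (my own reading of the thresholds above, not any distro's actual installer logic):

      ```python
      def recommended_swap_gb(ram_gb: float) -> float:
          """Swap size per the rule of thumb above (no hibernation)."""
          if ram_gb <= 4:
              return 2 * ram_gb   # small machines: 2x RAM
          if ram_gb <= 8:
              return ram_gb       # mid-range: match RAM
          return 4                # plenty of RAM: a flat 4GB of swap

      for ram in (2, 4, 8, 16, 32):
          print(f"{ram:>2} GB RAM -> {recommended_swap_gb(ram):.0f} GB swap")
      ```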

      • wax@feddit.nu · 1 day ago

        I think that recommendation came partly from hibernation, where RAM is dumped to disk before powering off. Today I’d probably use a swap file instead.

        • Smee@poeng.link · 16 hours ago

          Swap files are just a file version of the swap partition. I need a 24GB swap file to hibernate.

    • cm0002@lemmy.world (OP) · 1 day ago

      It depends on your definition of “usually”. For high-end GPUs for data centers, AI, workstations, or “enthusiasts”, yeah; for those applications you’re starting at around 16GB.

      GPUs for us plebs? No.

      • BombOmOm@lemmy.world · 1 day ago

        It’s also fairly cheap to buy 32+ GB of RAM; there are lots of choices under $80. Meanwhile, I’m not even sure how you’d find a video card with 32GB of VRAM (not that you really need that much; 12GB or 16GB is pretty solid for a video card nowadays).

      • Lucy :3@feddit.org · 1 day ago

        Tbf, we should be starting at 16GB for gaming GPUs too, especially at those prices. But … Nvidia.

        But yeah, modern HPC GPUs have at least 48GB or so, and the max is the AMD MI355X with 288GB of VRAM afaik. Which is actually less than my server’s RAM, ha! But also probably like a thousand times faster, considering my RAM runs at 1600 MT/s.
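        Quick back-of-the-envelope in Python, assuming dual-channel DDR3-1600 and the roughly 8 TB/s HBM3E bandwidth commonly quoted for the MI355X (both figures are assumptions, not measurements):

        ```python
        # Rough peak memory bandwidth comparison; all numbers are approximate assumptions.
        ddr3_mt_s = 1600               # transfer rate in MT/s
        bytes_per_transfer = 8         # 64-bit DDR channel
        channels = 2                   # assuming dual-channel
        ddr3_gb_s = ddr3_mt_s * bytes_per_transfer * channels / 1000  # ~25.6 GB/s

        mi355x_gb_s = 8000             # ~8 TB/s HBM3E, the commonly quoted figure

        print(f"DDR3-1600 dual-channel: ~{ddr3_gb_s:.1f} GB/s")
        print(f"MI355X HBM3E (assumed): ~{mi355x_gb_s} GB/s")
        print(f"Ratio: ~{mi355x_gb_s / ddr3_gb_s:.0f}x")
        ```

        So more like a few hundred times faster on raw bandwidth, but the same ballpark.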

        • zurohki@aussie.zone · 1 day ago

          I’m seeing games today regularly hitting 11GB, and that’s without raytracing or frame generation, which require even more VRAM.

          The new 8GB GPU Nvidia just launched is a trap. It exists to trick people into buying a GPU that they’ll need to upgrade next year.

      • zurohki@aussie.zone · 1 day ago

        If you have an 8GB GPU that’s a few years old, it’s probably doing okay-ish. It probably doesn’t have the performance to really suffer from VRAM limits and you don’t game with things like raytracing or ultra detail settings turned on because the GPU isn’t fast enough for those things anyway.

        My Vega 64 had 8GB VRAM and that was fine.

        If you buy one of the new GPUs with 8GB though, the VRAM is a huge problem. You have the GPU power to turn all the features on, but you’re going to see performance crippled because it overflows VRAM.

        Longevity is the other issue: when games released in 2025 run like ass on your 8GB GPU from 2017, you won’t be surprised. Bad performance from an 8GB GPU that released in 2025 for $500 is a problem.

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 1 day ago

      Normally you don’t even have that much virtual RAM. It’s at most twice your system RAM, but honestly past 8GB you’re gonna want to start closing out of stuff.