• @tal
    21 days ago

    I don’t really follow consoles, but I’ll take a guess based on the limited information in the article.

    If you figure that PC and various console hardware has converged to a fair degree over the decades and that stuff is gonna get generally ported around anyway, it’s hard to differentiate yourself on game selection or hardware features. Plus you’ve got antitrust regulators going after console vendors buying up games as exclusives, which tamps that down further.

    So okay, say that what you can compete on is, in significant part, how well you run what is more or less the same set of games. Most games already have rendering code for the PC that can scale pretty well with the hardware.

    It might make sense to make sure that you have faster rendering hardware so that it’s your version that looks the nicest (well, or at least second nicest, hard to compete with the PC’s hardware iteration time for users willing to buy the latest-and-greatest there).

    Let me extrapolate one step further. If that’s the direction of things, it might even make sense for console vendors to make some kind of cartridge containing the GPU: a durable, idiot-proof upgrade that doesn’t require opening the console, letting users move their console to the next gen mid-lifecycle at a lower cost than buying a whole new console. Controllers haven’t changed all that much, and serial compute capabilities aren’t improving much annually. The thing that is improving at a good clip is parallel compute.

    • @conciselyverbose@sh.itjust.works
      20 days ago

      Having an APU is part of how they hit the price points they do. A separate GPU would cost more on its own, and it would need its own memory instead of the shared pool, costing more still; the end result would be meaningfully more expensive for customers to upgrade than it is for them to sell their old system and buy a Pro.