• sp3ctr4l@lemmy.dbzer0.com · 3 months ago

    Uh, then AMD wins the PC GPU wars, due to Nvidia’s unexpected exit, and Intel becomes the new AMD in that market segment.

    Some Chinese companies also emerge as new PC GPU manufacturers, though exactly what market strategy they would specialize in or pursue is hard to predict.

    Anybody who just wants a gaming PC with local compute, or who doesn’t have good internet access / data caps… goes with AMD/Intel; ‘casuals’ rent their remote game rendering.

    The economic/cultural dynamics of PC gaming begin to resemble buying a new/used car vs leasing one; both get more financialized in their own ways.

    … Why does there need to be a whole article about this?

  • DaddleDew@lemmy.world · 3 months ago

    With China working hard to catch up on chip production, it is only a matter of time before we start seeing attractively priced Chinese-made GPUs on the market. No idea how long that will take, though.

    • 9488fcea02a9@sh.itjust.works · 3 months ago

      What makes you think Chinese firms won’t also jump on the AI bandwagon?

      Someone with an actual CS/engineering background, feel free to correct me, but I feel like the only way out of this for gamers is if someone finds a better type of chip for AI work. GPUs just happened to be the best thing for the job when the world went crazy; they were never designed specifically for these workloads.

      If someone like Tenstorrent can design a RISC-V chip for these LLM workloads, it might take some demand off gaming GPUs.
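
      A minimal sketch of why that could work (the toy layer dimensions are my own illustrative picks, not any real model’s specs): LLM inference is overwhelmingly dense matrix multiplication, which any chip with fast matrix units can absorb.

      # Toy transformer-layer weight shapes (illustrative, roughly GPT-2-small-sized).
      d_model, d_ff, seq_len = 768, 3072, 1024
      weights = {
          "attn in-projection (QKV)": (d_model, 3 * d_model),
          "attn out-projection":      (d_model, d_model),
          "MLP up-projection":        (d_model, d_ff),
          "MLP down-projection":      (d_ff, d_model),
      }

      def matmul_flops(seq, shape):
          # Applying a (k, n) weight matrix to seq tokens is a dense matmul
          # costing roughly 2 * seq * k * n floating-point operations.
          k, n = shape
          return 2 * seq * k * n

      total = sum(matmul_flops(seq_len, s) for s in weights.values())
      print(f"matmul FLOPs in one toy layer: {total / 1e9:.1f} GFLOP")
      # Nearly the whole layer is plain GEMM; a Tenstorrent-style RISC-V part
      # with a matrix engine could serve this as well as a gaming GPU can.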

      • otacon239@lemmy.world · 3 months ago

        You’ve got a good point. I wouldn’t be surprised if Nvidia was working on a dedicated AI platform to cover this exact issue. Then again, I would be equally unsurprised if they just didn’t care and didn’t mind gutting the home gaming market for short-term profit.

      • Jhex@lemmy.world · 3 months ago

        What makes you think Chinese firms won’t also jump on the AI bandwagon?

        The bubble won’t last that long.

        • chaogomu@lemmy.world · 3 months ago

          The only thing that will burst the bubble is electricity.

          The dot-com bubble burst due to dark fiber: massive Internet backbones were easy to build, and the last mile to people’s homes was not.

          The current electrical grid cannot support the number of data centers being built, let alone the ones planned on top of that… Well, dark data centers will be the new dark fiber.

          There’s more complexity to it all, but really it all boils down to power for this particular bubble.

          • Jhex@lemmy.world · 3 months ago

            Or lack of use? The current trend is fueled by hype that AI can do everything and will replace 50% of the workforce, another nightmare scenario… however, current AI may be an OK tool for some jobs and not much more. The world does not need 200 GW of AI datacentres to produce memes.
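
            For scale, a rough back-of-the-envelope (the ~460 GW figure for average US electricity demand is my approximation, and the 200 GW is the number from this thread, not a verified forecast):

            # Putting a hypothetical 200 GW AI datacentre buildout in context.
            ai_load_gw = 200        # figure from this thread, not a verified forecast
            us_avg_demand_gw = 460  # approx. average US demand (~4,000 TWh/yr / 8,760 h)

            share = ai_load_gw / us_avg_demand_gw
            print(f"That buildout alone would equal ~{share:.0%} of average US demand")
            # ~43%. Generation and transmission on that scale take many years to
            # permit and build, which is the power bottleneck described above.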

            • chaogomu@lemmy.world · 3 months ago

              Data centers are already paid for, so they’re being built. But if they can’t go online due to power costs, then that will burst the bubble.

              As for AI use… Sadly there are a bunch of people using it. And while the drop-off rate of people trying it and ditching it is steep, there’s actually a readoption curve. Which never fucking happens.

              So everyone is betting on the next model being better, and on more people giving it all a second chance… which are two open questions.

              But no power means no new model and no readoptions, thus no profit. Those other steps can fail, but without power it all fails.

              • Jhex@lemmy.world · 3 months ago

                Data centers are already paid for, so they’re being built.

                No, they are not… they are contracted on paper by OpenAI, for example, who has no way of paying for them other than “trust me bro”.

                But if they can’t go online due to power costs

                It’s not just the cost… the infrastructure to produce all this additional power does not exist… another issue with this massive bubble.

                As for AI use… Sadly there are a bunch of people using it. And while the drop-off rate of people trying it and ditching it is steep, there’s actually a readoption curve. Which never fucking happens.

                That’s served by the current infra, so what do we need the next 200 GW for? To make memes faster?

                So everyone is betting on the next model being better, and on more people giving it all a second chance… which are two open questions.

                By “everybody” you mean those peddling the bubble… we already saw the decline in the newer models, and we know it’s mathematically impossible to get rid of the slop or get to “general AI” through LLMs.

  • kibiz0r@midwest.social · 3 months ago

    1. Nvidia abandons x86 desktop gamers
    2. The only hardware gamers own is ARM handhelds
    3. Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
    4. AI bubble pops
    5. Nvidia tries to regain x86 desktop gamers
    6. Gamers are almost entirely on ARM
    7. Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else
    • carpelbridgesyndrome@sh.itjust.works · 3 months ago

      Nvidia does not care about the ISA of the CPU at all; they don’t make the CPU, after all. It’s also not clear how they would kill x86. If they leave the consumer GPU market, they cede it to AMD and Intel.

      • kibiz0r@midwest.social · 3 months ago

        Nvidia does not care about the ISA of the CPU at all.

        That’s kinda my point. They’re stuck communicating over PCI-E instead of being a first-class co-processor over AMBA.
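
        Rough numbers on that gap (these are ballpark peak figures I’m assuming for illustration, not exact specs):

        # Approximate peak bandwidth available to a co-processor, GB/s.
        pcie4_x16 = 32     # discrete card over PCIe 4.0 x16, per direction
        pcie5_x16 = 64     # discrete card over PCIe 5.0 x16, per direction
        unified_soc = 400  # unified-memory SoC fabric, e.g. Apple M1 Max class

        for name, bw in [("PCIe 4.0 x16", pcie4_x16),
                         ("PCIe 5.0 x16", pcie5_x16),
                         ("unified SoC fabric", unified_soc)]:
            print(f"{name}: ~{bw} GB/s")
        # A co-processor on the SoC's coherent fabric sees several times the
        # bandwidth (and far lower latency) of a card sitting on the far side
        # of a PCIe link, which is the "first-class co-processor" difference.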

  • brucethemoose@lemmy.world · 3 months ago

    AFAIK, the H100 and up (Nvidia’s bestselling data center GPUs) can technically game, but they’re missing so many ROPs that they’re really bad at it. There will be no repurposing all those AI datacenters for cloud gaming farms.

  • mindbleach@sh.itjust.works · 3 months ago

    PC gaming itself will hardly change, because AMD cards work just fucking fine. They’ve only ever been a little bit behind on the high end. They’ve routinely been the better value for money, and offered a much lower low end. If they don’t have to keep chasing the incomparable advantages Nvidia pulls out of their ass, maybe they can finally get serious about heterogeneous compute.

    Or hey, maybe Nvidia ditching us would mean AMD finds the testicular fortitude to clone CUDA already, so we can end this farce of proprietary computation for your own god-damn code. Making any PC component single-vendor should’ve seen Nvidia chopped in half, long before this stupid bubble.

    Meanwhile:

    Cloud gaming isn’t real.

    Any time after 1977, the idea that consumers would buy half a computer and phone in to a mainframe was a joke. The up-front savings were negligible and the difference in capabilities did not matter. All you missed out on was your dungeon-crawlers being multiplayer, and mainframe operators kept trying to delete those programs anyway. Once home internet became commonplace, even that difference vanished.

    As desktop prices rose and video encoding sped up, people kept selling the idea you’ll buy a dumb screen and pay to play games somewhere else. You could even use your phone! Well… nowadays your phone can run Unreal 5. And a PS5 costs as much as my dirt-cheap eMachines from the AOL era, before inflation. That console will do raytracing, except games don’t use it much, because it doesn’t actually look better than how hard we’ve cheated with rasterization. So what the fuck is a datacenter going to offer, with 50ms of lag and compression artifacts? Who expects it’s going to be cheaper, as we all juggle five subscriptions for streaming video?
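
    The 50 ms figure is easy to sanity-check. The frame-time math below is exact; the pipeline, network, and codec numbers are assumptions I picked as typical:

    # Rough end-to-end input latency: local rendering vs cloud streaming.
    frame_ms = 1000 / 60       # one frame at 60 fps is ~16.7 ms
    local_pipeline_frames = 2  # assumed render + display queue on a local PC
    network_rtt_ms = 50        # assumed round trip to the datacenter
    codec_ms = 10              # assumed video encode + decode budget

    local_ms = local_pipeline_frames * frame_ms
    cloud_ms = local_ms + network_rtt_ms + codec_ms
    print(f"local: ~{local_ms:.0f} ms, cloud: ~{cloud_ms:.0f} ms")
    # ~33 ms vs ~93 ms: streaming adds roughly 3.5 extra frames of lag.
    print(f"penalty: ~{(cloud_ms - local_ms) / frame_ms:.1f} extra frames")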

  • AMillionNames@sh.itjust.works · 3 months ago

    They are in cahoots with the RAM cartels to push gaming onto their cloud services, so that competitors like AMD don’t just pick those customers up. Making everything into a service is just a side benefit, although I’m sure they realize 16-bit SNES games are still fun and that people will just be driven to less powerful entertainment platforms.

  • etherphon@lemmy.world · 3 months ago

    Games and gaming have fully become like Hollywood and Silicon Valley, and I expect zero good things from them at this point. As with movies and music, most of the good stuff will now come from individuals and smaller enterprises. The fact is, today’s GPUs have enough power to do extraordinary things. Hardware moves so fast these days that no one squeezes performance out of anything like they used to have to. And not every game needs photorealistic ray-traced graphics, so these GPUs will be fine for many gamers as long as they remain supported through drivers.