• dogslayeggs@lemmy.world
    5 months ago

    I didn’t know there were that many PC gamers out there. /s

    Seriously, though, the pivot from making video cards to investing in AI and crypto is kinda genius. The crypto thing mostly fell into their laps, but they leaned in. The AI thing, though, I’m not sure how they decided to focus on that or who first pitched the idea to the board; but that was business genius.

    • dkc@lemmy.world
      5 months ago

      To your point, when I look at both crypto and AI I see a common theme: both need a huge amount of computation, call it supercomputing. Nvidia makes products that provide a lot of compute. Until Nvidia’s competitors catch up, I think they’ll do fine as more applications requiring that much computation are found.

      Basically, I think of Nvidia as a super computer company. When I think of them this way their position makes more sense.

      • Aceticon@lemmy.world
        5 months ago

        Also, those things are highly parallelizable and mainly deal with vector and matrix data. The same design that works for modern 3D graphics, lots of really simple but fast processing units optimized for vector and matrix operations, all working in parallel (for example, each point of a frame image can be calculated independently of all the others, in what’s called a fragment shader, and most 3D data is made of 3D vectors whilst the transforms are 3x3 matrices), turns out to also work fine for things like neural networks, where the neurons in each layer are quite simple and can all be processed in parallel. If the architecture weren’t layered like that, GPUs would be far less effective for it.
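        The layered structure is what makes the mapping work: each layer’s neurons reduce to one matrix multiply, and a GPU can compute every output element of that multiply in parallel, much like a fragment shader computing every pixel independently. A minimal sketch in plain NumPy (sizes and names here are arbitrary, purely for illustration):

        ```python
        import numpy as np

        rng = np.random.default_rng(0)

        def relu(x):
            # Simple per-element nonlinearity; trivially parallel.
            return np.maximum(x, 0.0)

        def forward(x, weights):
            # One matrix multiply per layer. On a GPU, each output
            # element of the multiply is computed by its own thread,
            # so all neurons in a layer are evaluated at once.
            for w in weights:
                x = relu(w @ x)
            return x

        # Two toy layers: 8 inputs -> 16 hidden -> 4 outputs.
        layers = [rng.standard_normal((16, 8)), rng.standard_normal((4, 16))]
        x = rng.standard_normal(8)
        y = forward(x, layers)
        print(y.shape)  # (4,)
        ```

        The point is that nothing in the loop body depends on any other neuron in the same layer, which is exactly the independence GPUs exploit.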

        To a large extent Nvidia got lucky that the stuff that became fashionable now works by doing lots of simple, highly parallelizable computations; otherwise it would have been the makers of CPUs who gained from the rise of these compute-hungry technologies.

    • kromem@lemmy.world
      5 months ago

      They were doing that for years before it became popular. The same tech for video graphics just so happened to be useful for AI and big data, and they doubled down on supporting enterprise and research efforts in that when it was a tiny field before their competitors did, and continued to specialize as it grew.

      Supporting niche uses of your product can sometimes pay off if that niche hits the lottery.

      • webghost0101@sopuli.xyz
        5 months ago

        Hardware made for heavy computing being good at stuff like this isn’t all that shocking, though. The bigger gamble is whether a new technology will take off at all. Nvidia, just like Google, has the capital to diversify: bet on all the horses at once and drop the losers later.

    • chrash0@lemmy.world
      5 months ago

      same as with crypto. the software community started using GPUs for deep learning, and they were just meeting that demand

    • RecallMadness@lemmy.nz
      5 months ago

      They were first to market with a decent GPGPU toolkit (CUDA) which built them a pretty sizeable userbase.

      Then when competitors caught up, they made it as hard as possible to transition away from their ecosystem.

      Like Apple, but worse.

      I guess they learned from their gaming heyday that not controlling the abstraction layer (e.g. OpenGL, DirectX) means they can’t do lock-in.

    • slacktoid@lemmy.ml
      5 months ago

      To their credit they’ve been pushing GPGPUs for a while. They did position themselves well for accelerators. Doesn’t mean they don’t suck.

    • swayevenly@lemm.ee
      5 months ago

      DLSS was a necessity: it delivers performance gains at a pace their raw hardware improvements could not keep up with.