• I Cast Fist@programming.dev
    6 days ago

    Does anyone know if it can run CUDA code? Because that’s the silver bullet ensuring Nvidia’s dominance in the planet-wrecking servers.

    • peppers_ghost@lemmy.ml
      6 days ago

      llama.cpp and PyTorch support it right now. CUDA isn’t available on its own as far as I can tell. I’d like to try one out, but the bandwidth seems to be ass: about 25% as fast as a 3090. It’s a really good start for them, though.
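
      For context, most PyTorch code doesn’t call CUDA directly: it goes through the device abstraction, so a vendor backend that plugs into that API can run existing models without CUDA itself. A minimal sketch of that portable style (assuming a stock PyTorch install; on some vendor builds, non-Nvidia accelerators are even surfaced through the standard `torch.cuda` namespace, as ROCm is):

      ```python
      import torch

      def pick_device() -> torch.device:
          # torch.cuda.is_available() is True on CUDA builds and also on
          # ROCm builds, which reuse the torch.cuda namespace; other vendor
          # backends expose their own equivalent checks.
          if torch.cuda.is_available():
              return torch.device("cuda")
          return torch.device("cpu")

      device = pick_device()
      # The same tensor code runs unchanged on whichever backend was picked.
      x = torch.ones(2, 2, device=device)
      print(x.sum().item())  # 4.0 on any backend
      ```

      Code written this way is what makes “PyTorch supports it” enough in practice, even when CUDA proper is absent.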