• slacktoid@lemmy.ml · 7 days ago (edited)

    Where can I buy this?

    Edit: I realized after I commented that this was the product page… My bad. It was more of a “take my money now” scenario.

      • slacktoid@lemmy.ml · 7 days ago

        Why wouldn’t it? (Like, I’m thinking: why would they support Microsoft, when the only other viable option is FreeBSD?)

        • eldavi@lemmy.ml · 5 days ago

          The world still uses Windows heavily, so adoption by end consumers relies on it.

  • geneva_convenience@lemmy.ml · 6 days ago

    For inference only. NVIDIA GPUs are so dominant because they can train models, not just run them. All other GPUs seem to lack that capacity.

  • I Cast Fist@programming.dev · 5 days ago

    Does anyone know if it can run CUDA code? Because that’s the silver bullet ensuring Nvidia’s dominance in the planet-wrecking servers.

    • peppers_ghost@lemmy.ml · 5 days ago

      llama and PyTorch support it right now. CUDA isn’t available on its own as far as I can tell. I’d like to try one out, but the bandwidth seems to be ass: about 25% as fast as a 3090. It’s a really good start for them, though.
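      For rough context on that 25% figure (my numbers, not the commenter’s): the RTX 3090’s spec memory bandwidth is about 936 GB/s, so a quarter of that would land around 234 GB/s. A quick back-of-the-envelope sketch:

      ```python
      # Back-of-the-envelope: what "about 25% as fast as a 3090" would mean
      # for memory bandwidth, assuming the comparison is bandwidth-bound.
      # 936 GB/s is the RTX 3090's published spec; the 0.25 ratio is the
      # commenter's estimate, not a measured benchmark.
      RTX_3090_BANDWIDTH_GBPS = 936.0
      ratio = 0.25

      estimated_gbps = RTX_3090_BANDWIDTH_GBPS * ratio
      print(f"Estimated effective bandwidth: {estimated_gbps:.0f} GB/s")
      ```

      Memory bandwidth tends to be the binding constraint for LLM inference at batch size 1, which is why a bandwidth gap translates fairly directly into a tokens-per-second gap.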

  • uberstar@lemmy.ml · 6 days ago

    I kinda want an individual consumer-friendly, low-end/mid-range alternative that can run my games and video editing software for very small projects… so far I’m only eyeing the Lisuan G100, which seems to fit that bill.

    This seems cool though; beyond AI, it could be used for distributed cloud computing or something of that sort.