If even half of Intel’s claims are true, this could be a big shake-up in the midrange market, which has been entirely abandoned by both Nvidia and AMD.

  • bluemellophone@lemmy.world
    15 days ago

    There are some smaller Llama 3.2 models on Ollama that would fit in 12 GB. I’ve run some of the smaller Llama 3.1 models in under 10 GB on NVIDIA GPUs.
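
    For example (just a sketch, assuming the Ollama Python client with a local Ollama server running, and using the llama3.2:3b tag as a stand-in for one of those smaller models), something like this fits comfortably in a 12 GB card:

    ```python
    import ollama  # pip install ollama; talks to a locally running Ollama server

    # Pull a small Llama 3.2 model. The 3B variant (4-bit quantized by default)
    # only needs a few GB of VRAM, so it fits well within a 12 GB budget.
    ollama.pull("llama3.2:3b")

    # Quick chat call to confirm the model loads and runs on the GPU.
    response = ollama.chat(
        model="llama3.2:3b",
        messages=[{"role": "user", "content": "In one sentence, what is VRAM?"}],
    )
    print(response["message"]["content"])
    ```

    The 1B variant (llama3.2:1b) is even smaller if you want to leave headroom for a longer context window.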