• CeeBee@lemmy.world
    1 year ago

    I don’t know of an LLM that works decently on personal hardware

    Ollama with ollama-webui. Models like solar-10.7b and mistral-7b run nicely on local hardware; Solar 10.7B should work well on a card with 8 GB of VRAM.
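
    A minimal sketch of that setup using Ollama's CLI (the `solar` and `mistral` model tags are assumed from Ollama's model library; the Docker invocation follows the Open WebUI project's published quick-start and its default ports):

    ```shell
    # Pull and run a model locally (assumes Ollama is installed and its server is running)
    ollama pull solar      # solar-10.7b
    ollama pull mistral    # mistral-7b
    ollama run solar "Explain what quantization does to a model."

    # Optional web frontend: ollama-webui (now Open WebUI) via Docker,
    # exposed on localhost:3000, talking to the host's Ollama instance
    docker run -d -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui ghcr.io/open-webui/open-webui:main
    ```

    Quantized GGUF variants of these models are what keep them within an 8 GB VRAM budget; Ollama pulls a quantized build by default.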

    • ParetoOptimalDev
      1 year ago

      If you have really low specs, use the recently open-sourced Microsoft Phi model.
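
      A quick sketch, assuming the `phi` tag in Ollama's model library (the model is small enough to run CPU-only on modest hardware):

      ```shell
      # Pull and run Microsoft's Phi model; no GPU required
      ollama pull phi
      ollama run phi "Explain recursion in one sentence."
      ```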