• @CeeBee@lemmy.world
    26 months ago

    > I don’t know of an LLM that works decently on personal hardware

    Ollama with ollama-webui. Models like solar-10.7b and mistral-7b run nicely on local hardware; Solar 10.7B should work well on a card with 8 GB of VRAM.
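    For reference, here is a minimal sketch of querying a locally running Ollama server over its HTTP API from Python. The default port 11434, the mistral model tag, and the prompt are assumptions about your setup, not anything specified above; swap in solar or whatever model you have pulled.

    ```python
    # Minimal sketch: send one prompt to a local Ollama server and print the reply.
    # Assumes `ollama serve` is running on the default port and the model has been
    # pulled beforehand (e.g. with `ollama pull mistral`).
    import requests

    response = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "mistral",    # assumed tag; use a solar tag etc. if that is what you pulled
            "prompt": "Why is the sky blue?",
            "stream": False,       # return one JSON object instead of a token stream
        },
        timeout=120,
    )
    response.raise_for_status()
    print(response.json()["response"])
    ```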

    • @ParetoOptimalDev
      16 months ago

      If you have really low specs, use the recently open-sourced Microsoft Phi model.
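      For very constrained machines, a minimal sketch of loading Phi-2 with the Hugging Face transformers library follows; the microsoft/phi-2 checkpoint, the float16 dtype, and the prompt are assumptions for illustration, not something the comment above specifies.

      ```python
      # Minimal sketch: run Microsoft's Phi-2 (~2.7B parameters) locally with transformers.
      # Assumes the transformers, torch and accelerate packages are installed.
      import torch
      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_id = "microsoft/phi-2"  # assumed checkpoint name on the Hugging Face Hub
      tokenizer = AutoTokenizer.from_pretrained(model_id)
      model = AutoModelForCausalLM.from_pretrained(
          model_id,
          torch_dtype=torch.float16,  # halves memory use; switch to float32 on CPU-only boxes
          device_map="auto",          # put layers on the GPU if one is available, else CPU
      )

      prompt = "Explain what a hash table is in one short paragraph."
      inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
      output = model.generate(**inputs, max_new_tokens=120)
      print(tokenizer.decode(output[0], skip_special_tokens=True))
      ```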