Running AI models without matrix math means far less power consumption—and fewer GPUs?

  • FaceDeer@fedia.io
    5 months ago

    I don’t think that making LLMs cheaper and easier to run is going to “pop that bubble”, if it even is a bubble. If anything, this will boost AI applications tremendously.