• @frezik@midwest.social
    19 months ago

    Often, these models form a feedback loop: the input from one search query becomes training data that affects the result of the next query.

    • @FooBarrington@lemmy.world
      19 months ago

      Sure, but that’s not how the kind of model this thread is about works (separate training and inference). You’re describing classical ML models with continuous updates, which you wouldn’t run on this kind of GPU infrastructure.
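
      The distinction the two comments are drawing can be sketched in a few lines. This is a hypothetical toy illustration, not real search-engine code: a frozen model trained offline (separate training and inference) never changes at query time, while an online model updates its weights on every query's feedback, producing the feedback loop described above.

      ```python
      class FrozenModel:
          """Trained offline; inference never changes the weights."""
          def __init__(self, weight: float):
              self.weight = weight

          def predict(self, x: float) -> float:
              return self.weight * x


      class OnlineModel:
          """Each query's feedback immediately updates the weights (feedback loop)."""
          def __init__(self, weight: float, lr: float = 0.1):
              self.weight = weight
              self.lr = lr

          def predict(self, x: float) -> float:
              return self.weight * x

          def update(self, x: float, target: float) -> None:
              # One SGD step on squared error: w -= lr * d/dw (w*x - target)^2
              error = self.predict(x) - target
              self.weight -= self.lr * 2 * error * x


      frozen = FrozenModel(weight=1.0)
      online = OnlineModel(weight=1.0)

      # Identical queries; the online model drifts with feedback, the frozen one doesn't.
      for x, target in [(1.0, 2.0), (1.0, 2.0), (1.0, 2.0)]:
          online.update(x, target)

      print(frozen.predict(1.0))  # unchanged
      print(online.predict(1.0))  # has moved toward the feedback target
      ```

      In practice, large models served on GPU clusters follow the frozen pattern: training runs as a separate batch job, and user interactions only enter the model at the next offline retraining, not query by query.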