• blobjim [he/him]@hexbear.net
    1 month ago

    Just running data through the resulting model (inference) is still somewhat expensive, since these models have so many parameters. And of course, for a lot of tasks you want to keep training the model on the new data you’re putting through it anyway.

      • blobjim [he/him]@hexbear.net
        1 month ago

        In their defense, I’m sure there are tons of genuinely useful machine learning models that don’t use much power once trained.

        I have an iPhone with Face ID, and I think the way they did that was to train a model on lots of people’s faces; they ship that expensive-to-train model with the operating system, and then it trains a little bit more on-device when you use Face ID. I can’t imagine it uses much power, given that the algorithm runs every time you unlock the phone.
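        The general pattern described here (ship an expensive pre-trained model, then cheaply adapt a small part of it on-device) can be sketched roughly. Everything below is illustrative: the frozen random projection stands in for the shipped model, the "owner" data is synthetic, and none of it reflects Apple's actual Face ID design.

```python
import numpy as np

# Sketch of "ship a big pre-trained model, fine-tune a tiny head on-device".
# The frozen random projection stands in for the expensive shipped model.
rng = np.random.default_rng(0)

W_frozen = rng.normal(size=(64, 16))   # frozen "pre-trained" extractor

def embed(x):
    # Frozen feature extractor: raw input -> small embedding. Never updated.
    return np.tanh(x @ W_frozen)

# Synthetic enrollment data: 20 "owner" samples (label 1), 20 others (label 0).
owner = rng.normal(loc=1.0, size=(20, 64))
other = rng.normal(loc=-1.0, size=(20, 64))
X = np.vstack([owner, other])
y = np.array([1.0] * 20 + [0.0] * 20)

# Tiny trainable head (17 parameters): logistic regression on the embeddings.
w_head = np.zeros(16)
b_head = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

feats = embed(X)          # computed once; the big model stays frozen
lr = 0.5
for _ in range(200):      # cheap on-device training loop over 17 parameters
    p = sigmoid(feats @ w_head + b_head)
    w_head -= lr * feats.T @ (p - y) / len(y)
    b_head -= lr * np.mean(p - y)

acc = np.mean((sigmoid(feats @ w_head + b_head) > 0.5) == (y == 1.0))
print(f"head-only training accuracy: {acc:.2f}")
```

        The point of the sketch is the cost asymmetry: the big extractor ran once per sample, while the per-user training only touches a handful of parameters, which is why on-device personalization can be cheap even when the shipped model was expensive to train.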

        I’m sure any model worth anything does require a lot of training and energy up front. Whether that’s worth it really depends on the eventual utility.