• Deebster@programming.dev · 5 months ago

    What do you mean by this? I’d say you definitely could, although it would need an expert to sort through the candidates to find the ones worth following up on.

    AI seems to have hit its limits, but this kind of stuff is what AI is great at.

    • Kairos · 5 months ago

      It’d be too expensive.

      • Deebster@programming.dev · 5 months ago

        Not at all - you’d have to train a model but then it could be run locally. You could even have something like SETI at Home, and run it on volunteers’ computers.

        I suppose some social media platforms charge for access, and it might not be a good idea to scrape them for something like this (or at least to admit to it).

        • Kairos · 5 months ago

          Yeah lemme just download many petabytes a month of data on my home Internet connection

            • Kairos · 5 months ago

              A few billion images yes

              • Deebster@programming.dev · 5 months ago

                You’re just making up big numbers and ignoring what I’m saying.

                Either it’s done centrally, in which case it’s feasible if they have funding, or it’s done SETI-style and users share the load (and could set data limits, etc.).
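
                The load-sharing point can be sanity-checked with back-of-envelope arithmetic. All the numbers below are assumptions (image count from the "few billion images" remark upthread; average image size and volunteer count are made up for illustration), but they show the per-volunteer share is modest:

                ```python
                # Back-of-envelope: how much data each volunteer handles in a
                # SETI@home-style setup. All figures are illustrative assumptions.
                TOTAL_IMAGES = 3_000_000_000   # "a few billion images" (from the thread)
                AVG_IMAGE_BYTES = 100 * 1024   # assume ~100 KiB per image
                VOLUNTEERS = 100_000           # assumed size of the volunteer pool

                total_bytes = TOTAL_IMAGES * AVG_IMAGE_BYTES
                per_volunteer_bytes = total_bytes / VOLUNTEERS

                print(f"Total dataset: {total_bytes / 1e12:.0f} TB")      # → Total dataset: 307 TB
                print(f"Per volunteer: {per_volunteer_bytes / 1e9:.1f} GB")  # → Per volunteer: 3.1 GB
                ```

                Under these assumptions the whole dataset is in the hundreds-of-terabytes range rather than petabytes per month, and each volunteer downloads only a few gigabytes.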