The New York Times sues OpenAI and Microsoft for copyright infringement

The New York Times has sued OpenAI and Microsoft for copyright infringement, alleging that the companies’ artificial intelligence technology illegally copied millions of Times articles to train ChatGPT and other services to provide people with information – technology that now competes with the Times.

  • kromem@lemmy.world · 1 year ago

    Here’s the author’s bio:

    Kit is a senior staff attorney at EFF, working on free speech, net neutrality, copyright, coders’ rights, and other issues that relate to freedom of expression and access to knowledge. She has worked for years to support the rights of political protesters, journalists, remix artists, and technologists to agitate for social change and to express themselves through their stories and ideas. Prior to joining EFF, Kit led the civil liberties and patent practice areas at the Cyberlaw Clinic, part of Harvard’s Berkman Center for Internet and Society, and previously Kit worked at the law firm of Wolf, Greenfield & Sacks, litigating patent, trademark, and copyright cases in courts across the country.

    Kit holds a J.D. from Harvard Law School and a B.S. in neuroscience from MIT, where she studied brain-computer interfaces and designed cyborgs and artificial bacteria.

    The author is well aware of the legal side of things.

    • maegul (he/they)@lemmy.ml · 1 year ago

      Oh I’m sure, and it was a good article, to be clear. But “the legal side of things”, especially from a certain perspective, and what the courts (and then the legislature) do with a new-ish issue can be two different things.

      • kromem@lemmy.world · 1 year ago

        “Kit worked at the law firm of Wolf, Greenfield & Sacks, litigating patent, trademark, and copyright cases in courts across the country. Kit holds a J.D. from Harvard Law School”

        The EFF is primarily a legal group and the post straight up mentions that it is a legal opinion on the topic.

        So I’m not really clear what you mean by “the legal side of things” as something separate from what a lawyer who has litigated IP cases and works at the intersection of law and tech says about a pending case in a legal opinion.

        Do you just mean a different opinion from different lawyers?

        • Kilgore Trout@feddit.it · 1 year ago

          I assume they mean that on topics not thoroughly tested in court, a judge can always make up their own mind based on how they personally feel.

        • maegul (he/they)@lemmy.ml · 1 year ago

          Yea. Legal opinions vary, and legal scholars can have problems, sometimes massive ones, with what courts and legislators end up doing. “Legal side of things”, in quotes, was intended to convey cynicism about, and a critique of, the idea, belief, or even desire some might have (not saying you) for the law to be “settled” and clear.

          • kromem@lemmy.world · 1 year ago

            Ah, well, if you want, the Columbia Journalism Review has a good summary of the developments, with links to various other opinions, particularly in the following paragraph:

            According to a recent analysis by Alex Reisner in The Atlantic, the fair-use argument for AI generally rests on two claims: that generative-AI tools do not replicate the books they’ve been trained on but instead produce new works, and that those new works “do not hurt the commercial market for the originals.” Jason Schultz, the director of the Technology Law and Policy Clinic at New York University, told Reisner that there is a strong argument that OpenAI’s work meets both of these criteria. Elsewhere, Sy Damle, a former general counsel at the US Copyright Office, told a House subcommittee earlier this year that he believes the use of copyrighted work for AI training is categorically fair (though another former counsel from the same agency disagreed). And Mike Masnick of Techdirt has argued that the legality of the original material is irrelevant. If a musician were inspired to create new music after hearing pirated songs, Masnick asks, would that mean that the new songs infringe copyright?

            (Most of those opinions are linked.)