- cross-posted to:
- technology@hexbear.net
- piracy@lemmy.ml
That’s fine, just let the rest of us do the same.
Actually, I’d prefer that piracy by individual users be considered fair use, but piracy by corporations not be. So their pirating is not fine, but ours should be.
Yeah, too much of this thread is hypocritical. Either stuff that’s free to copy should be free for everyone to copy, or it shouldn’t be.
“We didn’t do it, and if we did it was fair use, and if it wasn’t progress will be hampered if rules and regulations are too strict.”
Nationalize AI or tax it to fund UBI, and none of this is an issue.
Best idea I’ve heard in a year. Automation should benefit humanity as a whole.
I do wonder how it shakes out. If the case establishes that a license must be acquired to use copyrighted material, then maybe the license I’m setting on my comments might land commercial AI companies in hot water too - which I’d love. Open-source AI models FTW
That license would require any AI model trained on your comments to only output content under the same license. Not sure if you realize, but allowing commercial use is part of the Open Source definition.
Your content would just get filtered out from any training dataset.
As for going up against commercial companies… maybe you’re a lawyer; otherwise, good luck paying the legal fees.
AI is just overhyped. Every company invests millions into AI, and every new product needs to “have AI”. And then everybody also needs to file lawsuits. Rightly so if Meta really did pirate the books, but then that’s not an AI problem, it’s plain old piracy.
I was pretty sure OpenAI and Meta didn’t license gigabytes of books correctly for use in their commercial products. Nice that Meta has now admitted to it. I hope their “fair use” argument works and in the future we can all “train AI” with our “research dataset” of 40 GB of ebooks. Maybe I’ll even buy another hard disk and see if I can train an AI on 6 TB of TV series, all the Marvel movies, and a broad MP3 collection.
Btw, there was never any denial. Meta published a scientific paper about their LLaMA model in March of last year, and they clearly listed all of their sources, including Books3. Other companies aren’t that transparent, and even less so today.
Welp, the whole training dataset got DMCAed, right? And they got slapped with a nonsensical fine, right?
ohno my copyright!!! How will the publisher megacorps now make a record quarter??? Think of the shareholders!
Nope. Yer can feck off Zuck! Yer ain’t comin’ aboard my ship! 🏴☠️
I’m pretty sure “admits” implies an attempt to hide it. They’ve explicitly said in the model’s initial publication that the training set includes Books3.
deleted by creator
In the age of the internet, nothing is truly yours.
Just look at NFTs.
How are NFTs relevant?
they aren’t, except perhaps as a counterexample of some dubious sort
They were supposedly anchors to claim ownership of things in the real world.
They’re fancy receipts, and if people thought of them as just that it might be a technology with some limited non-monetary uses. But, the crypto grift was too strong.
Marking all your comments CC BY-NC-SA is a good bit.
The point of NFTs (beyond the pyramid scheme) was to enforce artificial digital scarcity at the individual level
They sold snake oil, nothing else.
This is the least shocking revelation.
Removed by mod
What a bunch of losers, thinking they are making the future… by stealing from as many artists as they can? How do you convince yourself you are doing the right thing when what you are doing is scaling up the theft of art from small artists into a tech-company-sized operation?
And how much oxygen has been wasted over the years by music companies pushing the narrative that torrenting is “stealing” from artists? This is so much worse than stealing (and a million times worse than torrenting), because the point of the theft is to destroy the livelihood of the artists who were stolen from and turn their art into a cheap commodity that can be sold as a service, with the artists seeing none of the monetary or cultural reward for their work.
Did you just make a contradictory argument for both sides?
Is your distinction that piracy by individuals gives cultural recognition while that of corporations doesn’t?
If you think piracy is warranted, even at the cost of artists/creators, how is a generalized AI that makes that work available and more accessible, as a culturally abstracted good, any different?
I’m going to imagine it’s because that culturally abstracted good is then put behind a paywall, which OP will then also pirate, thus fulfilling the prophecy.
Because I don’t see a strong argument that piracy comes at a direct, immutable cost to artists. I also don’t see a strong argument that piracy reduces the chance fans will pay for art when the art is reasonably easy to purchase and sold at a fair price. Of course there are complexities to this discussion, but ultimately, when you compare it to massive corporations wholesale stealing huge amounts of art with the specific intention of undercutting and destroying its value by commodifying it, I think the difference is pretty clear. One of these things is a morally arguable choice by an individual; the other is class warfare by the rich.
Joe Schmo torrents an album from a band he likes; maybe he buys the album in the future or goes to a concert and buys merch. Joe Schmo hasn’t mined some economic gain out of the band and moved on; he has become more of a committed fan because he loves the album. Meta steals from a band so that it can build an algorithm that produces knockoff versions of the band’s music, which Meta can then sell to, say, a company making a commercial that wants music in that style but would prefer not to pay an actual human artist a fair price for it. These are not the same.
(AI doesn’t necessarily create convincing fake songs yet, but you get my point as it applies to other art that AI can convincingly imitate, books and writing being a prime example.)
What a bunch of losers, thinking they are making the future… by stealing from as many artists as they can?
Are you aware of which community this is posted in?
Meta stealing intellectual property and exploiting it for corporate gain is not the same as normal users pirating content. The two are so far apart that it warrants its own discussion; they can’t be lumped together.
I didn’t realize at first, my bad. I realize that makes a lot of my post redundant but I think my point still stands.
So much hypocrisy that a massive corporation can actually steal like this and it is more socially acceptable than torrenting.
And that’s the issue I in particular have. It’s a double standard, and not only that, they’re using it to make money from their own tools.
It’s not the same as some kid pirating Photoshop to play around with, or a couple who are curious about GOT and want to watch it without paying HBO.
This is a separate issue, and I hate that this place is so Reddit-like that trying to talk about it gets “hurrr dur I guess you’re mad because AI and Meta are just the current hate train circle jerk hurrr i form my own opinions hurr”
Like, no, I’m upset because this is a whole new category of piracy.
I’m not upset, because I think it’s totally irrelevant: training an AI does not reproduce any works, and it’s no different from a person who reads or sees those works talking about them or creating in their style.
At its core, if this is given legal precedent, the distilled issue amounts to thought policing. It would be a massive regression of fundamental human rights with terrible long-term implications. This is no different from how allowing companies to own your data and manipulate you has directly led to a massive regression of human rights over the last 25 years. Reacting like foolish Luddites to a massive change that seems novel in the moment will have far-reaching consequences most people lack the fundamental logic skills to put together in their minds.
In practice, offline AI is like having most of the knowledge of the internet readily available for your own private use, in a way that is custom-tailored to each individual. I’m actually running large models on my own computer daily. This is not hypothetical or hyperbole; it’s empirical.
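For anyone curious what that looks like in practice, here’s a minimal sketch of local, offline inference, assuming the llama-cpp-python package and an already-downloaded GGUF model file (the path and prompt below are placeholders, not anything from this thread):

```python
# Minimal local-inference sketch (assumes `pip install llama-cpp-python`
# and a GGUF model file already on disk; the path is a placeholder).
from llama_cpp import Llama

# Load the model entirely offline; n_ctx sets the context window size.
llm = Llama(model_path="./models/example-7b.Q4_K_M.gguf", n_ctx=2048)

# Generate a short completion and print it.
result = llm(
    "Q: What is the Books3 dataset? A:",
    max_tokens=128,
    stop=["Q:"],  # stop before the model starts a new question
)
print(result["choices"][0]["text"].strip())
```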