Documentary filmmakers were publishing guidelines on how to ethically use generative AI right as Netflix’s true crime doc was adding fake images to the historical record.
Documentaries often include recreations of events, such as historical events that weren’t filmed. It’s usually noted as being a recreation or re-enactment. If AI-created images are used instead and are noted as being such, I don’t really see the problem, assuming the images are curated to depict the scene accurately.
The problem in both cases is that people remember these artistic depictions as real, even if there’s a disclosure.
Are we worrying about the fully functional adults that still need to be told not to drink Draino?
It wouldn’t be such a concern if they didn’t make up like 40% of the population.
Global population? You say “the”, so you obviously mean the one we have in common.
We’re all susceptible to this stuff, even when we’re aware of it.
As someone who actually worked in the corporate propaganda industry… I concur.
If you think you are impervious to this, then I got news for you.
I think I’m pretty impervious to the impulse of drinking drain cleaner. 🤷
Ok but drinking draino is the cure for all life’s problems. To each their own, though.
That or seeing Batman.
Yeah television doesn’t affect anyone. That’s been a great success. Fox News anybody? Pizzagate?
Don’t people attack crime scene re-enactors?
That argument extends to any realistic recreation of events. It’s not wrong, I’m just not sure what could be done about it.
This is how I’m leaning too. If done appropriately this should be no different than “this is a reenactment of events” seen in 90s and 00s true crime shows.
The big challenge is getting the content creators to respect that template and not bury the disclosure in the credits.
Yeah, they shouldn’t do that either.
A recreation is a scripted recreation, and I believe legally required to be noted as such. Whether that’s in the credits or on screen at time of playing I think is at the discretion of the filmmaker and editors.
Wildly different concept than generative AI models doing whatever they feel. At the end of the day, I can see why some people can’t see the difference, but it’s huge. I’d also say that if the former were improperly used in a horrific way, you’d just say “Well, the viewers can stay away from that documentary,” but as we’ve all seen over the past decade or so, once a falsely represented account of events is out there, you can’t stop it from spreading, whether it’s a still image or a reenactment. One has current legal repercussions and is covered by libel and slander protections, and the other doesn’t. World of difference.
I… I don’t think they are generating the history on the fly for each individual playback. Probably just generating images based on the concept, iteratively tweaking until it conveys the message that is desired by the artist. You know. Like most artistic works. AI is another tool.
Not to say training data being copped from hardworking artists is good, but an ethically trained AI image generator is not necessarily evil in this context if it’s used to execute the artist’s vision in the way they deem necessary and sufficient. Relying on outside people can often cloud the vision of a project.
That being said: pay artists for their work, license it if you want to train on it, and keep paying credit/royalties until copyright expires or the rights are purchased outright for competitive compensation.
The point is more that false “recreations” are protected when you have a planned and scripted setup to film and display it. Generative AI is not included in those laws yet, which is why everyone is trying to get their bullshit in while they can.