Let’s not kid ourselves. Publicly available information is invasive and a violation of privacy.
We have corporations who have effectively set up mass surveillance networks and they call it “adtech”.
There is an entire economy surrounding “publicly available information”. Corporations acting as data brokers and people-search websites compile far too much sensitive information about private individuals. Newspapers systematically report on events concerning private individuals that aren’t really of public interest, such as arrest records, and these articles hang around forever even if the arrest never results in a conviction or the record is expunged.
If this were employed by the government or law enforcement, it would absolutely include data that extends far beyond the reaches of publicly available information — and it’s worth pointing out that the US already operates a mass surveillance network in the form of the NSA’s PRISM program.
There is no way you could convince me that AI, prone to hallucination, is well suited to predicting crime or criminals. Even if it didn’t hallucinate, it still wouldn’t be possible to predict crime, only to potentially anticipate one. We aren’t 2D characters following a script — anything can happen.
Law enforcement is already very unhinged. Let’s not cheerlead the addition of any tools that aid in psychosis to their arsenal.