- cross-posted to:
- privacy@programming.dev
Grok, Elon Musk’s AI chatbot, has exposed hundreds of thousands of private user conversations through Google search indexing. When users click the “share” button to create a URL for sharing their chat, the conversation becomes publicly searchable - often without users realizing it[1][2].
Google has indexed over 370,000 Grok conversations, including sensitive content like medical questions, personal information, and at least one password[2:1]. Unlike OpenAI’s ChatGPT, which quickly removed a similar feature after backlash, Grok’s share function does not include any warning that conversations will become public[3].
According to Forbes, some marketers are already exploiting this feature by intentionally creating Grok conversations to manipulate search engine rankings for their businesses[2:2].
Really? The OpenAI and DeepSeek apps most people use are reached through Google, which means that even if the app is installed locally, it still operates online, the same as other search tools. You can’t run a complex LLM purely locally on a crappy PC; what you’re running is a desktop client for the LLM. Don’t confuse the two.
You can’t run an LLM on a crappy PC, that’s true. You need at least a decent CPU. But if you’re running an LLM locally, there are no calls to the outside world. I have a very mid computer, it isn’t great, and unfortunately I need to work with LLMs for my job. A call to my local LLM might take ~2 minutes where an online platform might take ~30 seconds, but I think that’s a reasonable trade.
If you have a gaming PC, you have a platform that can run a local LLM.
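To illustrate the "no calls to the outside world" point: a common way to run a local model is Ollama, which serves an HTTP API on localhost. A minimal sketch, assuming Ollama is running (`ollama serve`) and a model such as `llama3` has been pulled; the endpoint and model name here are assumptions, not anything from the article:

```python
import json
import urllib.request

# Ollama listens on localhost by default -- every request stays on your machine,
# nothing is sent to an external service (and nothing can be indexed by Google).
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to a locally hosted model and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # wait for the full response instead of streaming tokens
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (only works with a local Ollama instance running):
# print(ask_local_llm("Summarize why local inference is more private."))
```

On a mid-range machine this will be slower than a hosted service, as noted above, but the conversation never leaves the box, so there is no "share" URL to leak in the first place.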