AI-infused
Posted by Skywatcher
Sep 25, 2025 at 09:39 AM
Somehow, the idea of having a cloud LLM train on thousands of personal notes and then spit them out elsewhere to other users, even indirectly, doesn’t excite me much. Cloud LLMs are another privacy scandal in the making (just look at what is happening with OpenAI having to disclose thousands of users’ personal ChatGPT chats during the lawsuit against them by the NY Times).
Integration with local LLMs would be a good addition for those who aren’t keen on disclosing their entire private lives yet again to OpenAI, Anthropic, Meta, Google, etc.
Posted by Skywatcher
Sep 25, 2025 at 09:42 AM
Posted by Amontillado
Sep 25, 2025 at 08:32 PM
LinkedIn is widely reported to be about to start forwarding user data to Microsoft for AI training.
Opt out now, before it’s too late.
The actual LinkedIn announcement said the training would start in Europe in another month. LinkedIn was sued last year for training AI models on user data without permission.
If AI is trained on contemporary data, it’s going to be trained at least a little (and probably a lot) on its own output, a process I like to refer to as a closed-loop alimentary canal.
Frankly, it’s enough to make me want to stop bathing, get a ratty robe, and chant “Pie Jesu, Domine” while slapping myself in the forehead with a deck of punch cards.
Harumph!
Posted by gunars
Sep 25, 2025 at 09:21 PM
Amontillado wrote:
>Frankly, it’s enough to make me want to stop bathing, get a ratty robe,
>and chant “Pie Jesu, Domine” while slapping myself in the forehead with
>a deck of punch cards.
Just make sure you have sequence numbers in columns 73-80.
Posted by MadaboutDana
Sep 26, 2025 at 03:36 PM
Speaking of MCP, it might be worth taking a look at this report: https://www.itpro.com/security/a-malicious-mcp-server-is-silently-stealing-user-emails
Many software engineers have warned that MCP is singularly insecure…
Just sayin’!
Bill