Saying This To ChatGPT Could Actually Land You in Jail
ChatGPT is a powerful and flexible AI chatbot that can answer complex questions and discuss virtually any topic. However, OpenAI CEO Sam Altman has issued a stark warning to users who share sensitive information with the chatbot: your chats are not legally protected and could be used as evidence against you in a lawsuit.
Why You Shouldn’t Share Sensitive Information with AI
OpenAI’s ChatGPT has gained popularity, with some users turning to the chatbot as a therapist or life coach, sharing personal details and seeking advice. Although these chats might seem private, the company’s CEO, Sam Altman, noted in a recent podcast interview that your conversations lack legal privacy protections.
“I think we will certainly need a legal or a policy framework for AI,” Altman responded to podcaster Theo Von’s question. Altman continued, “So, if you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, we could be required to produce that, and I think that’s very screwed up.”
The CEO highlighted that, unlike conversations with a real-life therapist, lawyer, or doctor, which are protected by privilege, interactions with ChatGPT don’t enjoy the same legal safeguards. This means that OpenAI can be forced to disclose your chat records if required by law.
AI Chatbots Are Still Not Covered by Legal Protection
Altman said, “We should have, like, the same concept of privacy for your conversations with AI that we do with a therapist or whatever.”
He added that the lack of specific privacy protections for AI has only recently come into the spotlight, and that the issue needs to be addressed urgently.
Conversations with ChatGPT are not typically end-to-end encrypted, and OpenAI’s policy allows the company to review your chats for safety monitoring and for training its AI models.
Users can delete their conversations, and under the company’s data retention policy OpenAI normally deletes them permanently after 30 days. However, an ongoing lawsuit brought by The New York Times and other news publications is currently forcing OpenAI to retain all records indefinitely.
In the meantime, Altman suggested that users who plan to use AI extensively should have a clear understanding of the applicable privacy policy. Alternatively, there are workarounds for more secure and private use of AI, such as running similar models offline with tools like GPT4All by Nomic AI or Ollama.
Do you use ChatGPT a lot? What measures do you suggest to keep your chats safer? We want to hear your thoughts.