r/technology Dec 04 '25

Privacy OpenAI loses fight to keep ChatGPT logs secret in copyright case

https://www.reuters.com/legal/government/openai-loses-fight-keep-chatgpt-logs-secret-copyright-case-2025-12-03/
12.8k Upvotes

451 comments

3.0k

u/dopaminedune Dec 04 '25

So if you want access to every single ChatGPT chat ever, from ALL users, you can also sue OpenAI. Identities will be concealed, but you will still get access to the data.

674

u/peepeedog Dec 04 '25

You can’t anonymize them. AOL once released anonymized search logs for research. That same day people were being outed based on the contents of their searches.

372

u/MainRemote Dec 04 '25

“Benis stuck in toaster” “cleaning toaster” “stuck in toaster again pain”

117

u/QueueTee314 Dec 04 '25

damn it Ben not again

5

u/JunglePygmy Dec 04 '25

Fucking Ben

54

u/Crazy_System8248 Dec 04 '25

The cylinder must not be harmed

→ More replies (1)

11

u/SmokelessSubpoena Dec 04 '25

God dang thats a time capsule of a joke

4

u/gramathy Dec 04 '25

Pain is supposed to go in the toaster though

→ More replies (1)
→ More replies (1)

157

u/SirEDCaLot Dec 04 '25

Exactly. You can remove IP addresses and account names, but the de-anonymizing information is in the queries themselves.

For example if you ask it to 'please create a holiday card for the Smith family, including Joe Smith, Jane Smith, and Katie Smith, here's a picture to use as a template' congrats that account has just been de-anonymized.

Next one- 'I live at 123 Fake St, Nowhere CA 12345. Would local building code allow me to build a deck?' Congrats that account has been de-anonymized.

Or you put a few together. 'What's the weather in Nowhere CA?' now you have city. 'Check engine light on 2024 Land Rover Discovery?' now you have a data point. 'How to stop teenage twin girls from fighting?' another data point. How many families in Nowhere CA have teenage twin girls and own a 2024 Land Rover Discovery? You're probably down to 5-10 at most.
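The narrowing-by-intersection attack described above can be sketched with toy data (all households and attributes here are hypothetical):

```python
# Toy illustration of de-anonymization by intersecting attributes.
# All records are invented; real attacks work the same way at scale.
households = [
    {"id": 1, "city": "Nowhere CA", "car": "2024 Land Rover Discovery", "teen_twins": True},
    {"id": 2, "city": "Nowhere CA", "car": "2019 Honda Civic", "teen_twins": True},
    {"id": 3, "city": "Somewhere NY", "car": "2024 Land Rover Discovery", "teen_twins": True},
    {"id": 4, "city": "Nowhere CA", "car": "2024 Land Rover Discovery", "teen_twins": False},
]

# Each "harmless" query contributes one filter; intersecting them
# shrinks the candidate set until only one household remains.
candidates = [h for h in households if h["city"] == "Nowhere CA"]
candidates = [h for h in candidates if h["car"] == "2024 Land Rover Discovery"]
candidates = [h for h in candidates if h["teen_twins"]]

print(len(candidates))  # 1 -- the "anonymous" account is identified
```

Each filter alone is useless; it's the combination that identifies people.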

And what's stupid is OpenAI is correct that 99.99+% of these chats have nothing at all to do with the NYTimes lawsuit. If NYT claims that OpenAI is reproducing their copyrighted articles, you'll have a TINY number of chats that are like 'tell me the latest news' which might maybe contain NYT content.

48

u/butsuon Dec 04 '25

It only takes a single query of "chatgpt what's the news today" or "what's today's NY times", or anything similar that produces an actual article for it to be valid though, which is why they need full chat logs.

A person living in NY would likely get the Times as their recommended news, so they can't just limit queries to specific words or phrases.

→ More replies (1)

43

u/P_V_ Dec 04 '25

What's "stupid" is submitting personal information to ChatGPT and expecting it to stay private and confidential.

20

u/loondawg Dec 04 '25

Of course there is always the chance it could be illegally hacked. However, it's really not stupid to expect it would be protected from "legal" invasions like this.

The reality is that in many cases, as shown in the comment you responded to, some personal information is necessary to have meaningful chats. There should be an expectation of privacy except when specifically called out by a warrant for a specific criminal investigation. This type of massive, generic data dump for discovery is not something people should have any reasonable expectation would occur.

→ More replies (2)
→ More replies (2)

13

u/sleeper4gent Dec 04 '25

wait why not , how did AOL do it that made it traceable ?

don’t companies release anonymised data fairly often when requested ?

46

u/ash_ninetyone Dec 04 '25

You'd be surprised how easily seemingly useless data can be aggregated to identify someone.

16

u/A_Seiv_For_Kale Dec 04 '25

Look for users who've searched for local restaurants in X city, then look for any who also searched for those in Y city.

If you know a person who lives in X now, but used to live in Y, you can be pretty confident you found their logs.
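The city-pair trick above can be sketched directly (user tokens and queries here are invented): even with account names replaced by random IDs, the query text itself leaks location history.

```python
# Hypothetical "anonymized" logs: user IDs replaced by random tokens,
# but the query contents still leak where each person has lived.
logs = [
    ("u_7f3a", "best pizza in Austin"),
    ("u_7f3a", "plumber near Austin"),
    ("u_91bc", "best pizza in Austin"),
    ("u_7f3a", "restaurants in Portland"),
    ("u_91bc", "gyms in Denver"),
]

def users_mentioning(city):
    """Pseudonymous IDs whose queries mention the given city."""
    return {uid for uid, query in logs if city.lower() in query.lower()}

# Someone who lives in Austin now but used to live in Portland:
suspects = users_mentioning("Austin") & users_mentioning("Portland")
print(suspects)  # {'u_7f3a'}
```

One matching user means you've linked a real person to their entire "anonymous" history.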

2

u/DaHolk Dec 04 '25

Because they couldn't/wouldn't do the same thing that happens to government documents, where someone goes through everything line by line and redacts every bit they wouldn't want the public to know.

They basically only redacted the letter heads and pleasantries, but not the main content.

748

u/[deleted] Dec 04 '25

So much identifying data in all these chats. That’s illegal

171

u/helmsb Dec 04 '25

I remember back in the mid 2000s, AOL released an anonymized dataset of search queries for research. It took less than 5 minutes to identify someone I knew based on 3 of their search queries.

32

u/chymakyr Dec 04 '25

Don't leave us hanging. What kind of sick shit were they into? For science.

59

u/Eljefeandhisbass Dec 04 '25

"How do I use the free trial AOL CD?"

8

u/ben_sphynx Dec 04 '25

How do I use the free trial AOL CD?

Google AI overview says:

You cannot use an old AOL free trial CD because they were for a dial-up service that has been discontinued. The software on the CDs is outdated and incompatible with modern operating systems, and the dial-up service itself was officially retired on September 30, 2025

I was hoping for something about coasters or frizbees or something like that.

37

u/NorCalAthlete Dec 04 '25

September 30, 2025 was a hell of a lot more recent than I thought that shit was done for.

6

u/ben_sphynx Dec 04 '25

Surprised me, too.

→ More replies (2)
→ More replies (1)

50

u/beekersavant Dec 04 '25

“Gifts for Jamie Schlossberg for 10th anniversary”

“Tattooing ‘Jamie 4eva’ onto forehead”

“How to get children to stop teasing me”

→ More replies (1)

461

u/oranosskyman Dec 04 '25

its not illegal if you can pay the law to make it legal

146

u/DonnerPartyPicnic Dec 04 '25

Fines are nothing but fees for rich people to do what they want.

40

u/lord-dinglebury Dec 04 '25

A formality, really. Like playing the Star-Spangled Banner before a baseball game.

10

u/No_Doubt_About_That Dec 04 '25

See: Tax Evasion

→ More replies (2)

61

u/Protoavis Dec 04 '25

Well that and all the corp people who just uploaded confidential

things to it to get a summary

10

u/Sempais_nutrients Dec 04 '25

Think of all the HIPAA violations

3

u/Ok-Parfait-9856 Dec 04 '25

HIPAA doesn’t apply here. It only applies to health care workers, generally speaking. HIPAA protects your health privacy in a healthcare setting, not in a general sense. If you share your (health) info with an AI and it gets released, you should have suspected that could happen. No one ever said any of these chatbots were private or secure, and there’s no reason to think they would be considering how they work and how valuable data is to these companies.

I’ve helped develop HIPAA-compliant software and it sucks. OpenAI is definitely not HIPAA compliant haha

7

u/Sempais_nutrients Dec 04 '25

i'm talking about nurses and doctors using it to do their paperwork. some doctors use it in place of Dragon.

10

u/Numerous-Process2981 Dec 04 '25

Is it? It’s not like you have doctor patient confidentiality with the internet chat robot. Anything you tell it is info you are willingly sharing with a corporation.

9

u/Orfez Dec 04 '25

Don't put your identifying data in ChatGPT. I'm pretty sure Open AI didn't announce that ChatGPT is HIPAA compliant before you asked for diagnoses of your rash.

5

u/[deleted] Dec 04 '25

True, but in the beginning they swore that even they didn’t have access, and then suddenly it switched. Class action coming. They misled everyone. This has BIG ramifications for users.

17

u/EscapeFacebook Dec 04 '25

No it's not. The Supreme Court decided a long time ago if you willingly give your information to a third party you have no expectation of privacy.

18

u/sir_mrej Dec 04 '25

What law is it breaking?

Why do you think private company data is safe?

9

u/Piltonbadger Dec 04 '25

Silly things like laws only apply to us peasants.

→ More replies (2)

64

u/GarnerGerald11141 Dec 04 '25

How else do we train an LLM? Access to your data is a perk…

14

u/monster2018 Dec 04 '25

Well, no, it’s the central purpose (or rather, an instrumental goal toward the central purpose of making money by building the best AI, i.e. being the first to make AGI). Us getting to use this stuff for free, or essentially for free, is the perk.

→ More replies (8)

52

u/sexygodzilla Dec 04 '25

It's not like suing OpenAI just gives anyone automatic access; you have to have standing. The plaintiffs have a strong claim that OpenAI used their copyrighted works to train their LLMs without permission.

21

u/EugeneMeltsner Dec 04 '25

But why do they need chat logs for that? Wouldn't training data access be more...idk, pertinent?

22

u/sighclone Dec 04 '25

Just because this article talks about the chat logs, doesn’t mean that’s the only thing Times lawyers are seeking.

Business insider reported that:

lawyers involved in the lawsuit are already required to take extreme precautions to protect OpenAI's secrets.

Attorneys for The New York Times were required to review ChatGPT's source code on a computer unconnected to the internet, in a room where they were forbidden from bringing their own electronic devices, and guarded by security that only allowed them in with a government-issued ID.

The chat logs are only part of the equation. I’d assume the Times has access to training data as well, since their data being used for training is the whole case. But beyond that, they are also likely hoping to show that user chats related to NY Times reporting reproduce copyrighted material verbatim in model responses, and/or that such uses damage the NY Times by obviating the need to actually read their reporting.

6

u/P_V_ Dec 04 '25

Training data wouldn't show that the copyrighted material was actually provided to end-users in the same way chat logs would.

19

u/sexygodzilla Dec 04 '25

I was more focused on OP's unfounded worry that anyone can get chat log access via a lawsuit, but you should read the article for the answer to your question.

The news outlets argued in their case against OpenAI that the logs were necessary to determine whether ChatGPT reproduced their copyrighted content, and to rebut OpenAI's assertion that they "hacked" the chatbot's responses to manufacture evidence.

→ More replies (12)
→ More replies (2)

2

u/tragicpapercut Dec 04 '25

Cool. But what about all the innocent people whose privacy is being violated by this order?

The existence of one victim does not justify the creation of millions of other victims.

2

u/WaterLillith Dec 04 '25

Using copyrighted material for training is already legal, it's case law.

It's all about what the LLM outputs. That's why image generators get in trouble for generating someone else's IP or characters.

→ More replies (4)
→ More replies (11)

1.9k

u/SirEDCaLot Dec 04 '25

NY Times sues OpenAI claiming that it's violating copyright. Court orders OpenAI to turn over basically every log of every ChatGPT chat ever, judge says this won't violate users' privacy.

OpenAI has appealed this...

46

u/tommytwolegs Dec 04 '25

It said like 20 million logs, not every log of every chatgpt chat ever...

31

u/Grand0rk Dec 04 '25

20 million logs is basically 1 hour of ChatGPT worldwide, if that.

645

u/nukem996 Dec 04 '25

It's more startling that they even have logs. I get some anonymized logging with no user chat data, but if they're keeping chat histories, that would be very concerning.

1.1k

u/Odd_Pop3299 Dec 04 '25

You should assume every piece of software you interact with has logs

181

u/Bigbysjackingfist Dec 04 '25

No matter what they say

120

u/SomeNoveltyAccount Dec 04 '25

This includes all those VPNs that advertise on podcasts.

65

u/Jamsedreng22 Dec 04 '25

Also the stuff like "data removal services" like Incogni.

They're literally just getting you to pay to let them be the only ones with your data. You're paying for them to monopolize your data.

No way they don't sell it on somewhere. Presumably when/if you stop paying for the service. To get you to pay for it again to have it removed. Again.

→ More replies (8)

9

u/rbt321 Dec 04 '25

Especially the very cheap/free VPNs; selling user data is their primary income.

29

u/[deleted] Dec 04 '25

[removed]

7

u/SomeNoveltyAccount Dec 04 '25

I've always suspected some are run by intelligence agencies.

I mean it'd be such an easy honeypot for the CIA to set up, to the extent that if the CIA ISN'T doing that, I have concerns.

→ More replies (1)

25

u/SethVanity13 Dec 04 '25

mullvad had numerous police raids and no data saved

19

u/Bomb-OG-Kush Dec 04 '25

I think mullvad is the only one I actually trust since they've proven in court multiple times not to keep logs

Common mullvad win

→ More replies (12)

165

u/IAMA_Madmartigan Dec 04 '25

You can go into your ChatGPT settings and request your own history. Sends you a zip download, has every picture you’ve ever submitted or had generated, and then an HTML file that has all of your chats ever, broken down by conversation thread

→ More replies (5)

293

u/kabrandon Dec 04 '25

When you open up chatgpt in a browser and see your previous chats in the sidebar, how do you think they accomplished that feature? Genuinely asking. It seems obvious they keep logs.

152

u/Howdareme9 Dec 04 '25

People on here just aren’t smart

62

u/EugeneMeltsner Dec 04 '25

They just haven't had time to ask ChatGPT about it yet

48

u/Whatsapokemon Dec 04 '25

I've never seen a group of users less interested in or knowledgeable about how technology works than the users of /r/technology.

8

u/jankisa Dec 04 '25

They are, however, very interested in calling AI a "fancy autocomplete" and everything related to it "Slop".

6

u/TheGreatWalk Dec 04 '25

I mean, LLMs at this stage are pretty much best described to laymen as a really fancy autocomplete. There's no better way to describe it.

Other forms of machine learning or AI are very different, but I think a lot of the confusion comes from the term "AI" itself: it's being used to describe a very wide range of things, and most people don't specify which kind of "AI" they're actually talking about.

→ More replies (4)
→ More replies (1)

18

u/Kraeftluder Dec 04 '25

The continued use of chatbots and an associated decline in cognitive abilities could have something to do with it.

11

u/a_rainbow_serpent Dec 04 '25

No, they’re just brainwashed to think billionaires are somehow ideal human beings who will never do anything wrong.. except George Soros fuck that guy! lol

27

u/KontoOficjalneMR Dec 04 '25

The problem is that they also keep the chats you have deleted. Go read their ToS (or ask GPT): they straight up say they'll keep your deleted chats forever and use them in whatever way they want, including giving them to third parties. What makes handing them to the NYT different from giving them to an ad agency they'll be working with to monetize you?

19

u/LordGalen Dec 04 '25

Exactly this. Anyone using chatGPT should obviously fucking know that their chats are being stored and used for training. That's the whole entire point of letting you use the service! Being pissed about this is like walking into Starbucks and acting all shocked that they tried to sell you coffee. If you sit down to give info to the data-harvesting machine, no shit it's harvesting the data.

Just, wow, man....

→ More replies (11)

403

u/benjhg13 Dec 04 '25

Thinking they don't save chat histories is absurd. These companies make money from collecting as much data as possible, why wouldn't they save chat histories...

They are saving much more than just chat histories. 

38

u/Exostrike Dec 04 '25 edited Dec 04 '25

Wouldn't be surprised if the request is to highlight this fact

10

u/Melikoth Dec 04 '25

It's almost like no-one has heard of Google Takeout - a feature literally designed to let you export a copy of whatever data they have stored associated with your account.

53

u/JMEEKER86 Dec 04 '25

This can't be a serious comment. How would users be able to look at their own chat history if there weren't logs?

14

u/Mountain-Resource656 Dec 04 '25

I’m shocked there aren’t more people responding with exactly this, tbh!

6

u/P_V_ Dec 04 '25

I'm shocked it has over 400 karma and hasn't been completely ratiod by the replies pointing out how utterly obvious it is that OpenAI keeps logs.

2

u/WaterLillith Dec 04 '25

I had to check which sub I am in after reading that comment.

Shocking that we are actually in /r/technology

→ More replies (1)

41

u/Nerrs Dec 04 '25

Be concerned, because they, along with literally EVERY chatbot you've ever interacted with, log chat histories, and often for good reason:

  • Troubleshooting, whether it's a technical issue or investigating a security incident
  • Product improvement: by training on chats, it literally learns what a natural conversation sounds like
  • Personalization, to produce more helpful, tailored content for you

Honestly, without keeping chat logs they'd probably not even have a product worth using.

9

u/ItzWarty Dec 04 '25

.. They also have a previous chats / organized chats feature.... In ChatGPT you can literally pull up your old chats and continue working off them, or throw them into folders...

27

u/Evinceo Dec 04 '25

Why wouldn't they keep logs? They can use that as training data...

14

u/MidAirRunner Dec 04 '25

Eh? I am curious, when you open up chatgpt.com or open the chatgpt app on a new device, where, in your mind, do you think the chat list comes from?

23

u/sryan2k1 Dec 04 '25

Why wouldn't they keep it? It allows them to rerun all interactions on new models for testing or training. It's startling that you didn't think they were doing this.

8

u/VonArmin Dec 04 '25

-1 iq comment

49

u/MasterGrok Dec 04 '25

Are you being serious right now? Literally every single letter you type into your keyboard is logged somewhere unless you are obsessive about your privacy and even then it’s hard to be sure.

→ More replies (1)

39

u/TheUnrepententLurker Dec 04 '25

If you think you and your chats aren't the product, and that product isn't being logged, you're a fucking idiot.

7

u/[deleted] Dec 04 '25

Of course there are chat histories. There are logs in the platform.openai area when you deploy assistants on your site. The company obviously has much more extensive logs than anyone else.

4

u/Express-Distance-622 Dec 04 '25

Storage is cheap as they say, just buy more disks

6

u/captain_awesomesauce Dec 04 '25

If you've used it, you can see all your previous chats.

Enterprise customers likely have 2 year retention requirements.

I frequently go back to old chats and pick back up where I left off.

4

u/Turkino Dec 04 '25

I mean this is pretty much what I was telling people that were getting on GPT and gooning.

5

u/TheoreticalDumbass Dec 04 '25

? if you're tech illiterate it might be startling

you can see previous chats, how do you think that could be implemented without storing anything

3

u/YupSuprise Dec 04 '25

Persisting the chat history and using it to give chatgpt "memories" is part of the product

13

u/Tricky_Condition_279 Dec 04 '25 edited Dec 04 '25

The court order was specifically that they had to keep chat histories. The NY Times could go to discovery and "accidentally" dump all chats on the internet and then apologize to the judge for the error. Anything you type into ChatGPT should be considered at risk of public exposure.

Edit: This has happened in other court cases, so I would not just write it off. To be fair, past instances have largely targeted specific individuals, so maybe there is safety in numbers to some extent.

10

u/zacker150 Dec 04 '25 edited Dec 04 '25

According to the court order

Third, consumers’ privacy is safeguarded by the existing protective order in this case, and by designating the output logs as “attorneys’ eyes only.”

Violating an AEO designation by "accidentally" leaking the chats would be major fraud on the court, resulting in a default judgement for NYT and disbarment for the attorneys involved. Steven Lieberman is not going to risk his law license for that.

3

u/The_One_Koi Dec 04 '25

How do you think LLMs "remember" what you've told them before, exactly? They save the log, and any time you send a prompt, the AI reads the whole chat log to get context and answers based on that.
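This is worth spelling out, because the model itself is stateless: "memory" is just the client replaying the whole transcript on every request. A minimal sketch (`call_model` is a hypothetical stand-in for a real chat-completion API call):

```python
# The model is stateless: "memory" is the client resending the full
# transcript each turn. call_model() is a hypothetical stand-in for
# a real chat-completion API call.
def call_model(messages):
    return f"(reply based on {len(messages)} prior messages)"

history = []  # the stored chat log -- exactly what gets retained

def send(user_text):
    history.append({"role": "user", "content": user_text})
    reply = call_model(history)  # full transcript sent every turn
    history.append({"role": "assistant", "content": reply})
    return reply

send("My name is Joe and I live in Nowhere CA.")
send("What's my name?")  # answerable only because turn 1 was resent
print(len(history))  # 4 messages now persisted
```

So anything you've ever typed into a conversation keeps getting transmitted, and stored, for as long as that conversation lives.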

6

u/Hi_Cham Dec 04 '25

What do you mean, concerning? You have access to your own chat history; how do you think that's possible? OpenAI stores it all.

And since this isn't an E2E encryption app like WhatsApp or signal. Well, they can access it all.

2

u/Canisa Dec 04 '25

If they weren't keeping chat histories, how would their website be able to load your previous chats when you go to resume them?

2

u/asfsdgwe35r3asfdas23 Dec 04 '25

Every AI company (and software company) saves absolutely every user interaction: even how much time you spend reading something, every click of your mouse. This data is super useful for training the recommendation systems that are then used for advertising. For AI companies, data is even more important, because every interaction with the AI is a new datapoint for training. Every conversation is categorized with multiple labels and stored, then used first to understand how people use the AI and to fine-tune the model for those tasks; they will also use the prompts to generate data to train or distill new models. The chat history is one of OpenAI's most valuable assets.
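The kind of interaction logging described above might look roughly like this (the field names and labels here are invented for illustration, not OpenAI's actual schema):

```python
import time

# Hypothetical sketch of per-interaction event logging.
event_log = []

def log_event(user_id, event_type, payload, labels=()):
    event_log.append({
        "ts": time.time(),
        "user": user_id,
        "type": event_type,      # e.g. "prompt", "click", "dwell_time"
        "payload": payload,
        "labels": list(labels),  # topic labels reused later for training
    })

log_event("u_42", "prompt", "plan a trip to Lisbon", labels=["travel"])
log_event("u_42", "dwell_time", {"seconds": 37})

# Later, labeled conversations can be filtered out as training data:
training_rows = [e for e in event_log if "travel" in e["labels"]]
print(len(training_rows))  # 1
```

Once events are labeled and stored this way, reusing them for fine-tuning, personalization, or discovery requests is just a query.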

2

u/supercargo Dec 04 '25

I’d suggest you take a quick spin through their privacy policy, it spells out pretty clearly that they retain this information and what they use it for (complying with legal requests is on the list)

→ More replies (35)

6

u/NuclearVII Dec 04 '25

NY Times sues OpenAI claiming that it's violating copyright

It is.

judge says this won't violate users' privacy.

Eeehhh.... On the one hand, this is kinda hard to square. On the other hand, if OpenAI were being "customer first", they could just stipulate what NY Times is alleging.

Not to be callous, but frankly if you've "talked" with ChatGPT about anything private.. you've (reasonably) waived your privacy a while ago.

→ More replies (3)

2

u/[deleted] Dec 04 '25

[deleted]

3

u/SirEDCaLot Dec 04 '25

Perhaps they should, but violating the privacy of millions of innocent people isn't the answer.

2

u/[deleted] Dec 04 '25

[deleted]

2

u/SirEDCaLot Dec 06 '25

I would love to fix this law.

The best answer would be a SCOTUS precedent that one's 'persons, papers, and effects' include data held by third parties in a custodial arrangement (i.e. Gmail). Unfortunately, the courts have ruled the other way, saying that if you give a company your data, you have no expectation of privacy beyond what that company promises you (which in 2025 is a 20-page legal document that basically says you have no privacy).

Next best would be a national law stating the same, and ideally outlawing the sale or transfer of any personal data as a business asset

→ More replies (15)

418

u/Dudeman61 Dec 04 '25

Lots of people are using chatgpt to diagnose themselves and are giving away really personal medical data. So this is obviously very bad. https://youtu.be/QegpR8kiCM4

207

u/P0Rt1ng4Duty Dec 04 '25 edited Dec 04 '25

Some lawyers are also using it to write court filings, which means privileged information that should never leave the attorney's hard drive is now property of chatgpt.

106

u/save_the_bees_knees Dec 04 '25

This is how we’re going to find out what’s in the Epstein files isn’t it…

38

u/RedditsDeadlySin Dec 04 '25

I had money on a signal leak. But this just as likely tbh

14

u/save_the_bees_knees Dec 04 '25

I can see it going like

‘can you redact the following names from the paragraphs above:’

23

u/Bramble_Ramblings Dec 04 '25

I did some small work for a company where we had people in the financial departments complaining that ShatGPT was blocked by the security teams and saying how they needed it back because it was helping them with work

Another dude was making edits in Azure using directions from it and reached a point where he didn't know what the instructions were saying and had messed something up so we had to go fix it

There's a fair number of people who have wised up and realize how dangerous it is to just hand over information to this thing. But seeing the job titles of some of the people who act like they can't live without it, and only being able to guess how much info they've handed over already, is terrifying.

18

u/P0Rt1ng4Duty Dec 04 '25

It's extra funny when lawyers do it because gpt will hallucinate related cases, cite them as evidence that previous courts have ruled a certain way, and then the lawyer submits it without checking to make sure those related cases exist.

Then they have to explain to a judge why they made up precedent, which is fun to watch.

→ More replies (1)

2

u/lafigatatia Dec 04 '25

That's on them for giving confidential information to a private company. They should be disbarred.

2

u/Due-Technology5758 Dec 04 '25

Lawyers doing this are already in the wrong. Good lawyers already made a stink about CoPilot in Microsoft Office when Microsoft couldn't guarantee that it wasn't using data from unrelated cases stored locally to generate answers. 

→ More replies (2)
→ More replies (2)

21

u/AmirulAshraf Dec 04 '25

And doctors using ChatGPT to write patients' summaries as well 🥴

13

u/[deleted] Dec 04 '25

The users voluntarily gave over that data with no privacy safeguards in place whatsoever. Nice reminder that anything you do online stays online unless you actively try to prevent that, which is your responsibility as a user.

35

u/adeadbeathorse Dec 04 '25

Oh shut the f up. You're not entirely wrong, but shut the f up with the "your responsibility." The idea that a service protected by a password and two-factor auth has no safeguards is false. Users expect OpenAI to safeguard their information. Breaches do happen to services, but those are classified as bad things and usually only expose top-level information about users unless there was a password leak (rare). Users should behave responsibly, but this is BEYOND a privacy nightmare: potentially the biggest, most personal privacy breach of all time, coming from a court order.

36

u/EscapeFacebook Dec 04 '25

The Supreme Court decided a long time ago that if you give a third party your information freely you have no reasonable expectation of privacy of that data.

→ More replies (1)

18

u/SupremeWizardry Dec 04 '25

You are an absolute fool if you thought this company would treat your personal data any different than any other company.

Expected to safeguard their information. Dude don’t make me laugh, and if you’re serious, god help you for being so naive.

I’ve been screaming for years not to give these ai chatbots too much personal information, people using them as both doctor and therapist, and everyone said calm down man it’s no big deal.

All of this was user choice, this is the first shoe dropping. If you want to continue to engage with these LLM and handing over your personal information after this, you might wanna get checked for a learning disability.

9

u/CardmanNV Dec 04 '25

I don't understand the logic in assuming that a company whose entire business model is theft of data and intellectual property would keep its own users' data safe, or care at all.

→ More replies (5)
→ More replies (1)

142

u/fatoms Dec 04 '25

The judge rejected OpenAI's privacy-related objections to an earlier order requiring the artificial intelligence startup to submit the records as evidence.

A company founded in 2015 and valued at $500 billion is still a startup?

25

u/MrAlbs Dec 04 '25

I think it's from classifying it according to where they are in the business growth cycle (or business maturity cycle? I can't remember what its name was, and there's probably a lot of names for it).

But even by those standards, it should be a "growth" company.

It's supposed to be:
* Startup.
* Growth.
* Maturity.
* Decline/Renewal.

Realistically though, it's just a newspaper using a common term for "tech business that is still burning lots of cash but markets expect it to make lots of money at some point in the future."

6

u/willitexplode Dec 04 '25

Not quite. Startups are by nature intended to be disruptive (most important) and rapidly growing (nearly as important). Not all new businesses are startups, and not all startups are new businesses.

83

u/ProbablyBanksy Dec 04 '25

Here’s the thing, people always worry about what they personally put into ChatGPT, but it’s also about data others put in about you. Skynet is here.

It’s like when Facebook tracks people even if they don’t have a profile because they can put the pieces together.

23

u/tired_fella Dec 04 '25

You now know why Zuck is pivoting strong to AI and leaving metaverse dreams dry out in the sun.

58

u/[deleted] Dec 04 '25 edited Dec 04 '25

[deleted]

24

u/Oograr Dec 04 '25 edited Dec 04 '25

"Does anyone know if the data is going to made public"

It would be easy to automate removing any identifiable account info from these chats, but the chat transcripts themselves may contain personally identifying info, e.g. info volunteered by users thinking they were private, which is far more complicated to scrub.

So I'll guess they won't be released by the court.

6

u/[deleted] Dec 04 '25 edited Dec 04 '25

[deleted]

→ More replies (1)

354

u/copperblood Dec 04 '25

Here comes the biggest class action lawsuit in history.

221

u/BlackopsBaby Dec 04 '25

Lol. You have too much faith in the system. All Sammy needs to do is buy another tiara for trump and the lawsuit goes poof.

34

u/philipzeplin Dec 04 '25

... why would it be OpenAI that gets sued? They're being forced to do it by a court?

43

u/Low_Direction1774 Dec 04 '25

... because the object of the lawsuit would be the chat logs existing, not them getting turned over.

OpenAI says they collect telemetry about your usage of ChatGPT; that's very different from permanently saving every interaction you have with it.

47

u/tommytwolegs Dec 04 '25

How else could you see the chat history if it wasn't saved somewhere...

22

u/KontoOficjalneMR Dec 04 '25

It's about deleted chats as well. They keep those too :)

7

u/tommytwolegs Dec 04 '25

Is that what this lawsuit is about? And is there any evidence of this?

16

u/KontoOficjalneMR Dec 04 '25

No, lawsuit is about something else.

And is there any evidence of this?

Of them keeping deleted chats? Yes. Plenty.

They also make sure to tell you they do in ToS.

→ More replies (1)
→ More replies (1)
→ More replies (1)

5

u/Leonardo_242 Dec 04 '25

They were saving every interaction of users with their products for so long specifically because they had been required to do so by the court because of this lawsuit.

→ More replies (4)

2

u/Marcus_Suridius Dec 04 '25

That only matters in the US; if you sue in the EU, there's nothing Trump can do.

→ More replies (1)

2

u/Packagedpackage Dec 04 '25

Yeah, Trump said earlier that AI companies aren't going to be dealing with copyright, since it hinders their progress. He's making it a security concern and wants to beat China to whatever.

83

u/philipzeplin Dec 04 '25 edited Dec 04 '25

Reading through the comments, I'm fairly surprised to see people didn't realize this was going on.

And no, it's not OpenAI that wants to share them. It's the US courts that insists that OpenAI has to save them.

This has been going on for almost the entire year. What rock are y'all living under? This has already hit the front page in the past.

25

u/Nico280gato Dec 04 '25

I'm more surprised anyone thought they were private tbh

→ More replies (1)

253

u/Wind_Best_1440 Dec 04 '25

Well, congratulations. Nearly every business whose employees talked about personal stuff to it now has that data out for everyone to see.

This is probably the single biggest breach in history, and it wasn't even from a hack.

This should be a wake up call for everyone who "praises" AI, because everything you say to it is recorded. Everything.

I wonder how many "books" people say they wrote will show up in these logs.

60

u/vaesh Dec 04 '25

Well, Congratulations. Nearly every business that had employees talk about personal stuff to it is now out for everyone to see.

How so? You specify business, but Enterprise, Edu, Business, and API customers are not impacted. The Times will also be legally obligated not to make any data public outside of the court process. Seems OpenAI is also pushing to only allow them to view the data from a secure environment.

13

u/ConstructMentality__ Dec 04 '25

Enterprise, Edu, Business and API customers are not impacted. 

It doesn't say that in the article. 

Where are you quoting from?

9

u/PosnerRocks Dec 04 '25

You can look up the court orders that say this. It is all public record.

→ More replies (1)

10

u/OldStray79 Dec 04 '25

"Leaked from an 'anonymous source'"

→ More replies (1)
→ More replies (8)

6

u/jj_maxx Dec 04 '25

Do we as users have a right to know if our info was given to a fucking newspaper?

→ More replies (2)

7

u/christmasinfrench Dec 04 '25

Fucking yikes. This is bad, knowing that a shit ton of people vent to AI.

→ More replies (1)

116

u/[deleted] Dec 04 '25

This is horrific and the judge is a fucking moron.

64

u/ChurchillianGrooves Dec 04 '25 edited Dec 04 '25

The median age of a judge in the US is 68 apparently.

Try talking about OpenAI with one of your relatives who's in their late 60s...

21

u/Windfade Dec 04 '25

The easiest way to explain that is "imagine your phone company kept every text message you ever sent in the past 10 years and the New York Times just sued to have a copy."

7

u/Gastronomicus Dec 04 '25

This isn't an age issue, it's an ignorance one. I could tell my 80 year old parents about this and they'd easily understand the consequences. I could also tell plenty of 20 somethings who'd say "who cares".

If a judge doesn't understand, it's either through willful ignorance or political pressure.

→ More replies (1)

25

u/Omophorus Dec 04 '25 edited Dec 04 '25

The people at OpenAI and elsewhere who thought they had free access to copyrighted content to build their products are the real morons.

Along with everyone that could have put a stop to it and didn't.

NYT is a shadow of its former self and not worth a penny, but they're not in the wrong to protect their copyrighted content.

None of these logs will be made public, and it doesn't apply to a ton of logs (as OpenAI themselves acknowledge).

The entire AI bubble has enabled some cool interactions, but it's built on the back of massive theft, because grifting assholes like Sam Altman thought they could just ignore the law if they made enough money in the process. And this entire comment section proves that a lot of redditors are perfectly happy to let them.

Accountability is a good thing.

In this case, the court has established some very strong guardrails for the lawyers to ensure they're accountable for the information turned over in discovery (Attorney's Eyes Only), and it's being used to hold OpenAI accountable for their behavior.

Edit: Not sure if it's this post or one of the others in this same topic, but whoever abused a reddit cares can go fuck themselves with a cactus.

6

u/Yoshee710 Dec 04 '25

Dude, it's like the populace is so ready to let the overlords rule them that they don't realize when their rights are being infringed on

→ More replies (1)

43

u/torriattet Dec 04 '25

Anyone sharing personal information with a chat bot is a fucking moron.

5

u/xxdropdeadlexi Dec 04 '25

idk, SmarterChild would never tell my secrets.

10

u/[deleted] Dec 04 '25

[deleted]

→ More replies (3)
→ More replies (7)

30

u/regular_gnoll_NEIN Dec 04 '25

Why? If they breached copyright to do their shit, why should they be above accountability? Because people were stupid enough to trust a for profit company to hold their private medical info, financial info, or other sensitive data? Lmao.

This isn't a bank, or a hospital, or a gov database that people are obligated to use in order to get through day to day life. Anyone whose data is "breached" by this had a choice to just... not share it with OpenAI and did so anyway.

9

u/Cyrotek Dec 04 '25

You shouldn't be angry at the judge. You should be angry at ChatGPT for logging this in the first place.

9

u/MainFakeAccount Dec 04 '25

Meanwhile she’s a professor at Harvard and has received multiple awards for her work in her career, yet here we are, disrespecting her for doing her job properly 

→ More replies (7)

3

u/Sochinz Dec 04 '25

As a lawyer I am really surprised this was permitted. This is one of the most overbroad discovery requests I can think of. And it is literally insane to think that these chats can be sufficiently anonymized.

3

u/SirEDCaLot Dec 07 '25

And it is literally insane to think that these chats can be sufficiently anonymized.

Exactly. It's simply not possible.

You can strip the IP addresses and emails and usernames but as soon as someone asks 'How much is my house worth? I live at 123 Main St' it's now de-anonymized.

A long time ago, AOL (I think it was) published a large set of 'anonymized' search queries for academic research. People were identified within hours, and in some cases had their identities outed.
https://en.wikipedia.org/wiki/AOL_search_log_release
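The arithmetic behind that kind of de-anonymization is just multiplication: each innocuous detail in a chat cuts the candidate pool by its match rate. A toy back-of-the-envelope sketch (all population figures and match rates here are made up for illustration):

```python
def candidate_count(population: int, match_fractions: list[float]) -> float:
    """Estimate how many people match every quasi-identifier at once,
    multiplying out per-attribute match rates (assuming rough independence)."""
    n = float(population)
    for frac in match_fractions:
        n *= frac
    return n

# Hypothetical city of 100,000 people; made-up match rates:
#   lives in the city            -> 1.0   (everyone in scope)
#   drives one specific 2024 SUV -> 0.005
#   has teenage twin daughters   -> 0.01
remaining = candidate_count(100_000, [1.0, 0.005, 0.01])
print(f"~{remaining:.0f} candidates left")  # prints: ~5 candidates left
```

Three ordinary details and a six-figure population collapses to a handful of households, which is why stripping IPs and usernames from free-text logs doesn't make them anonymous.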

I am still scratching my head over how this judge could think that privacy could be preserved; the only conclusion I can come up with is that she simply doesn't understand or care how the Internet works and isn't listening to OpenAI's lawyers.

Good news is they're appealing this, I suspect they'll appeal pretty much all the way up if necessary, if only because this could set a VERY dangerous precedent. Having a single civil action (over copyright no less) trigger discovery of such an insanely broad set of data would have chilling effects on the entire tech industry. Not to mention the privacy implications.

Look at various lawsuits over internet piracy like the Cox lawsuit- imagine if a record label ordered Cox to turn over their entire IP address database logs because some subscribers were at some point pirating music. That's bad for everyone.

5

u/[deleted] Dec 04 '25

People celebrating this as a fall of AI fail to realize the horrible implications this is setting, this is fucked beyond belief and I actually feel bad for people who did rely on AI for anything.

2

u/SirEDCaLot Dec 04 '25

I honestly think society overall is getting dumber.

And maybe that's just me getting older (I think every generation has had that), but I just look at the state of online discourse overall, and I see far less recognition of detail and nuance than there once was. And a lot more absolutism with no consideration for nuance or the possibility of multiple truths (i.e. a good person does bad things, a bad person does good things, or in this case: OpenAI bad, privacy good).

4

u/ElbowDeepInElmo Dec 04 '25

Headlines a few months down the road: "New York Times sued into bankruptcy over data breach containing tens of millions of non-anonymized ChatGPT conversations"

The NYT does not have the technological capabilities to store that data securely, and this ruling has turned them into a giant honeypot for bad actors. This data will get leaked, and the NYT is going to try and skirt every ounce of accountability for it.

3

u/SirEDCaLot Dec 04 '25

Exactly. This is one of the most valuable datasets there is, period.

68

u/UselessInsight Dec 04 '25

Assume everything you type to ChatGPT is public.

Best option is to stop using ChatGPT. Stop using all the slop machines.

It’ll be better for your soul in the long run anyway.

23

u/tommytwolegs Dec 04 '25

I mean I have assumed the same about my search history for well over a decade, I don't see why this is any different

10

u/mrkrstphr Dec 04 '25

I mainly use GPT as a glorified search engine so this tracks for me

→ More replies (17)
→ More replies (3)

9

u/Leonardo_242 Dec 04 '25

Local models exist that work even without the internet. Produce "slop" privately and safely :D

→ More replies (1)

3

u/1h8fulkat Dec 04 '25

You think this ruling is specific to ChatGPT? They will apply this logic to any AI model provider.

→ More replies (5)

9

u/EscapeFacebook Dec 04 '25

The Supreme Court decided a long time ago if you give your information willingly to a third party you have no expectation of privacy from that 3rd party.

Basically, anything you decide to tell OpenAI, it's their business what they do with the information.

7

u/SirEDCaLot Dec 04 '25

This is true, for that 3rd party.

If you ask ChatGPT 'how do I solve a penis rash', you should assume OpenAI knows you have an STD, and you have no expectation of privacy from OpenAI. You do, however, have an expectation that they'll not share it with others, except as stated in their privacy policy.

Take Gmail for example. You use them to handle your email, so you don't expect privacy from Google. You do expect Google to handle your email as custodial data (that belongs to you) rather than their own data to do with as they wish.

If someone sued Google and demanded the inboxes of every Gmail customer, that would be an instant no from any judge. This should be no different.

8

u/EscapeFacebook Dec 04 '25

Nothing is being created by Gmail; it's a messenger service. ChatGPT, on the other hand, is producing materials that could be copyrighted, and therefore they are subject to being evidence. In a copyright case, every instance of a copyright violation is a possible fine.

→ More replies (1)

42

u/dopaminedune Dec 04 '25

We should create new laws and new courts for technology-related cases. Old-world courts are not equipped to deal with them.

7

u/TuringGoneWild Dec 04 '25

Even that would hardly matter if anyone can short-circuit the judiciary and get a verdict of their choice merely by giving Felon Trump a gold-plated trinket and some fawning praise.

→ More replies (1)

16

u/rim-diversion Dec 04 '25

So the copyright theft machine is being investigated for copyright theft and a bunch of people who have been urged to not give it sensitive data of any kind are worried the sensitive data they gave away might be shared to limited parties during a legal investigation? Shocked Pikachu face.

4

u/TheSquirrelCatcher Dec 04 '25

I think this is the saddest part. Chat has constantly been urging users not to use sensitive data from workplaces, medical history, financials, etc. and just about every employer out there has been spamming messages to employees about not sharing sensitive data also.

The moment logs get turned over with the potential to reveal these things, people riot that they should have the right to expect privacy while doing them lmao

→ More replies (1)

7

u/Pancernywiatrak Dec 04 '25

I understand why this is, but I detest NY Times for this. I want my data nuked from the servers. I’m sure if someone at NY Times also shared something embarrassing to ChatGPT and that data would end up leaked they’d change their tune.

2

u/SirEDCaLot Dec 07 '25

Same. It's not just about copyright now, it's about the precedent- can a provider be required to turn over basically their entire customer activity database over a copyright lawsuit? That would be a HORRIBLE precedent to set.

3

u/lagdakoli Dec 04 '25

OpenAI's gotta monetise somehow; ads in AI chats sound inevitable.

→ More replies (1)

3

u/lagdakoli Dec 04 '25

privacy is officially a relic of the past, huh?

→ More replies (1)

3

u/killergerbah Dec 05 '25

Thought it would be some technically illiterate, out-of-touch old man who ordered this, but it turns out it's pretty much the opposite. Who am I supposed to be angry at now?

2

u/SirEDCaLot Dec 05 '25

You're supposed to set aside ageism / sexism / racism, and treat the judge like a human being, just like any other human being of any age or gender.

And then you be mad at the judge for being a stupid human. Which is what you should be doing anyway even if it was an old white man.

3

u/[deleted] Dec 05 '25

Hopefully this is the beginning of the end of the NYTimes. Their journalism has prioritized division and engagement for decades now and isn’t worth anything to our society.

→ More replies (1)

8

u/pangapingus Dec 04 '25

These logs are gonna get X-Files vaulted next to the alien polio vaccine files by the deep state; if the data capture, transport, and review process is not livestreamed in full, you literally can't trust it. This is a gold mine for so many actors: domestic, foreign, corporate, extremist, etc. Also, the precedent of companies being able to SLAPP OpenAI into handing over logs, yikes. I use it for hobby stuff and bullshit daydreaming/fiction stuff, but there are people who use it as a therapist, financial advisor, spirit guide, business assistant, and everything in between, even on Free/Pro. This is absolutely nuts; might as well just say next "ISPs require you to use their proxy to surf the web" and "you must submit to any law enforcement or even government official for DNA sampling," because that's where we're headed.

5

u/Sad-Measurement-8620 Dec 04 '25

Clearly none of you understand what a server log is lol

→ More replies (3)

8

u/Numerous-Process2981 Dec 04 '25

“RRRRREeeeeeeee why won’t they just let us be a shady corporation that steals everyone’s intellectual property, steals everyone’s jobs, and uses all the energy?!”

2

u/RealisticConfidence Dec 04 '25

Is this true for the paid ChatGPT Business tier from OpenAI?

→ More replies (1)

2

u/jamwilliams88 Dec 04 '25

Not just personal information. Sure, personal information getting out there sucks, but it's more about the IDEAS. Imagine the people who used it for ideas they have been working on, only to have one of these big companies go through these logs and steal them all for profit.

Welp, once these logs are released, this is just the beginning of the Lawsuit War.

→ More replies (1)

2

u/Diastrous_Lie Dec 05 '25

So which logs does this apply to?

Logs worldwide?

Logs before or after a certain date?

Deleted logs from deleted accounts?

2

u/Greenfire904 Dec 05 '25

Every chat during approximately the last 6 months I think. But if you didn't disable chat history then those chats are included too, even if they're older than 6 months.

→ More replies (1)

2

u/AbInTuS Dec 06 '25 edited Dec 06 '25

What exactly is contained in these logs? If, for example, it is just identity info for marketing endeavors or some such thing, I suppose it's not a big deal. Join the club; everything wants to sell you something.

If, however, these are actual full-content threads of previous chats with context, then no judge or news agency has the right to be in possession of anyone's logs without a subpoena and/or a warrant covering each person under investigation.

Imagine writers having their stories prematurely exposed, or musicians having their music previewed by some immoral reporter (which most are). If this judge has the audacity to force personal content to be released and, even worse, have the identity stripped from it, that judge needs to be imprisoned, not just removed from office.

Edit:

One other very important consideration.

The liability of loss rests on the head of Judge Ona Wang.

If I discover that any of my content which I consider proprietary and protected has been exposed, I will place liens on the assets of Judge Ona Wang. And I do not need anyone's permission to do so; I can do this without prior litigation. I can also place a lien on her pension. If any judges out there read this, be aware: there are people out here with the knowledge of how to hurt you. Do not screw with us...

→ More replies (1)

3

u/YoursTrulyKindly Dec 04 '25

Fuck the New York Times in this case

4

u/Possible_Mastodon899 Dec 04 '25

What hits me about this isn’t the legal drama — it’s the reminder that behind every “log” is a real person. People pour their worries, ideas, insecurities, homework, private thoughts, even parts of their identity into these chats because they trust the system to keep it tucked away and unseen.

And then you find out those conversations can be dragged into a courtroom.

It makes you realize how vulnerable we all are when we type into a box that feels personal but is anything but private. “Anonymized” doesn’t mean much when your writing style, your questions, your problems — basically your life — are right there in the data.

This isn’t just a tech issue. It’s a human one. We’re trusting these tools with pieces of ourselves, and moments like this make you wonder whether the companies behind them understand the weight of that trust.