Marcel Bucher is a Professor of Plant Sciences at the University of Cologne, Germany. Like many people worldwide, he was a frequent user of chatbots, with ChatGPT playing a central role in his everyday professional life. He paid for ChatGPT Plus (around £17 a month) and used it to draft emails, prepare lectures, create exams and analyse student responses. Over time, he built up roughly two years’ worth of work on the platform.

OpenAI uses customer interactions with ChatGPT — including prompts, uploaded files and conversations — to train and improve its models, personalise the user experience and ensure safety. Users can opt out in several ways, such as by using “Temporary Chat”, whose contents aren’t used for training and are deleted after 30 days. Users can also export their data, or manually delete individual chats or their entire account.
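The export route is the safeguard that would have saved Bucher’s work. As a rough sketch: assuming the unpacked export contains a `conversations.json` file listing conversation objects with a `title` field (the exact layout of OpenAI’s export is an assumption here), a few lines of Python can split it into one file per conversation for local backup:

```python
import json
from pathlib import Path

def backup_conversations(export_path: str, out_dir: str) -> int:
    """Split a ChatGPT data export's conversations.json into one JSON
    file per conversation, named by title. Returns the number written.

    Assumes `export_path` points at a JSON list of conversation objects
    each with a (possibly null) "title" key -- an assumed layout.
    """
    conversations = json.loads(Path(export_path).read_text(encoding="utf-8"))
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for i, conv in enumerate(conversations):
        # Sanitise the title so it is safe to use as a filename.
        raw = conv.get("title") or f"untitled-{i}"
        title = "".join(
            c if c.isalnum() or c in " -_" else "_" for c in raw
        ).strip() or f"untitled-{i}"
        (out / f"{i:04d}-{title}.json").write_text(
            json.dumps(conv, ensure_ascii=False, indent=2), encoding="utf-8"
        )
    return len(conversations)
```

Run periodically against a fresh export, this keeps a plain-file copy of every conversation outside OpenAI’s infrastructure — exactly the kind of redundancy the rest of this story argues for.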
In August last year, Bucher decided to stop sharing his data with OpenAI by turning off the “Improve the model for everyone” setting in the Data Controls menu. Doing so deleted all his chats and documents, with no clear warning that the action would cause irreversible deletion.
Assuming it was a technical error, Bucher tried reinstalling the app and accessing his account through different browsers and devices. None of it worked. When he contacted OpenAI, he was told the data was gone permanently. Due to the company’s “privacy by design” policy, once data sharing is disabled, chat history is automatically erased — and because no backups exist, it cannot be recovered. All that remained were partial copies of work Bucher had saved elsewhere.
Unsurprisingly, Bucher was angry. He said he trusted ChatGPT as a stable workspace because it had “always been available, remembered the context of ongoing conversations and allowed [him] to retrieve and refine previous drafts”. He also pointed out that universities actively encourage the use of generative AI in teaching and research. However, he now believes tools like ChatGPT “were not developed with academic standards of reliability and accountability in mind” and should not be considered completely safe for professional use.
While many of Bucher’s criticisms are fair, it’s worth considering what ChatGPT is actually designed to do. At its core, it’s a chatbot — not a storage system. It lacks robust organisation, version history and backup infrastructure, it’s vulnerable to cyberattacks and its data is owned and managed by a private company. Dedicated storage services such as Microsoft OneDrive or Amazon S3 are built specifically to protect data, with backups and recovery systems in place. Some online reactions were less than sympathetic. One Bluesky user summed it up neatly: “Amazing sob story: ‘ChatGPT deleted all the work I hadn’t done’.” Harsh, but not entirely unreasonable — particularly given that this came from a professor with a 20-year career at a university ranked 164th in the World University Rankings 2026. One might expect more foresight, preparedness and a slightly healthier approach to data management.
Ultimately, this story serves as a reminder not to treat any digital tool as something it isn’t. ChatGPT can be useful, but relying on it as a primary workspace without proper backups is a gamble — and in this case, one that didn’t pay off.
As of January 2026, OpenAI has changed this behaviour. Disabling the “Improve the model for everyone” setting now prevents conversations from being used to train future models without automatically deleting them. A helpful fix — just a little too late for Professor Bucher.
Sources:
- Windows Central: ChatGPT erased two years of a professor’s work with one click — and there was no way back
- PC Gamer: A professor lost two years of ‘carefully structured academic work’ in ChatGPT because of a single setting change: ‘These tools were not developed with academic standards of reliability in mind’
- Gizmodo: Professor Reports That OpenAI Deleted His Work, World Laughs in His Face